
Digital Foundry: Redfall PC - DF Tech Review - Another Unacceptably Poor Port

adamsapple

Or is it just one of Phil's balls in my throat?
We weren't too impressed with the Xbox version of Redfall, but the PC version has its own range of unacceptably poor issues - from traversal stutter to nondescript, unhelpful settings menus, to visual bugs and terrible CPU core utilisation. Support for all modern upscalers and some kind of shader precompilation pass help matters, but this is still far from the kind of quality we'd expect from a major triple-A release.



 

SlimySnake

Flashless at the Golden Globes
For those keeping track:

1) Gotham Knights
2) Callisto Protocol
3) Dead Space (Frostbite)
4) Hogwarts
5) Star Wars
6) Redfall

Guess what all these games have in common?

UE4.

All of a sudden, I'm not too thrilled about UE5. The traversal stutter is present in Fortnite, and the Matrix demo was CPU bound as well, just like all these games.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
- "Between this and Jedi Survivor I'm not sure which is worse" - Alex
- Incredibly barebones graphical menu with only default UE4 presets.
- At least the game supports multiple image upscaling options
- But XeSS and DLSS can cause bloom-related issues

- Redfall does *NOT* have Shader Comp stutter.
- But it does suffer from traversal stutter, which makes the game "kinda unplayable on mid-range PCs"
- The GPU goes under-utilized even at max settings; CPU threads are left unused.

- The Ryzen 3600 is constantly CPU limited, with extremely erratic frame times
- On this CPU, DF recommends using a 30 FPS cap (half-refresh sync) for a better experience
- But the above only holds at Low settings; the High preset incurs lots of drops below 30 FPS

- "Epic setting shadows look worse than games that came out 15 years ago"
- Not recommended for anyone except folks with modern high-end hardware, and even then, expect traversal stutter.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
For those keeping track:

1) Gotham Knights
2) Callisto Protocol
3) Dead Space
4) Hogwarts
5) Star Wars
6) Redfall

Guess what all these games have in common?

UE4.

All of a sudden, I'm not too thrilled about UE5. The traversal stutter is present in Fortnite, and the Matrix demo was CPU bound as well, just like all these games.

Dead Space remake is Frostbite IIRC
 

Fess

Member
For those keeping track:

1) Gotham Knights
2) Callisto Protocol
3) Dead Space
4) Hogwarts
5) Star Wars
6) Redfall

Guess what all these games have in common?

UE4.

All of a sudden, I'm not too thrilled about UE5. The traversal stutter is present in Fortnite, and the Matrix demo was CPU bound as well, just like all these games.
Yeah. It’s honestly extremely concerning, especially since that’s the big engine everybody seems to be using going forward, including likely all MS studios besides maybe id.
 
Last edited:

feynoob

Banned
Embarrassed Shame GIF
 

Fafalada

Fafracer forever
It's really weird, I don't recall UE4 games being this bad a few years ago
Traversal stutter has always been a thing in UE games (I still remember what a stuttery mess BioShock Infinite was on PC), and not without reason; it's an issue embedded deep in the engine architecture.
Shader issues existed before too; they were just mostly hidden by the driver stack on older DirectX versions.
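To make the "embedded in the engine architecture" point concrete, here is a rough, engine-agnostic sketch (the function names are hypothetical and this is not UE4 source): traversal hitches happen when streaming work for the next area ends up being serviced on the game thread, so one frame pays for the entire disk read and object setup.

```cpp
// Hypothetical, engine-agnostic sketch of why level traversal hitches:
// if a streaming request is serviced on the game thread, that frame
// takes as long as the disk read plus object construction.
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>
#include <vector>

struct LevelChunk { std::vector<int> objects; };

// Stand-in for disk I/O plus spawning objects (tens of milliseconds).
LevelChunk LoadChunkBlocking(int id) {
    std::this_thread::sleep_for(std::chrono::milliseconds(80));
    return LevelChunk{std::vector<int>(1000, id)};
}

int main() {
    using clock = std::chrono::steady_clock;

    // Naive traversal: load on the game thread. One ~80 ms frame = visible hitch.
    auto t0 = clock::now();
    LevelChunk chunk = LoadChunkBlocking(42);
    auto hitch = std::chrono::duration_cast<std::chrono::milliseconds>(clock::now() - t0).count();
    std::printf("blocking frame cost: %lld ms (%zu objects)\n",
                static_cast<long long>(hitch), chunk.objects.size());

    // Streaming traversal: kick the load to a worker and keep simulating;
    // the chunk gets swapped in on a later frame once the future is ready.
    auto t1 = clock::now();
    std::future<LevelChunk> pending = std::async(std::launch::async, LoadChunkBlocking, 43);
    auto frame = std::chrono::duration_cast<std::chrono::milliseconds>(clock::now() - t1).count();
    std::printf("async frame cost:    %lld ms\n", static_cast<long long>(frame));

    pending.get();  // make sure the worker finishes before exit
}
```

Even the async version can still hitch in practice, because spawning and initialising whatever was loaded usually has to happen back on the game thread, which is presumably why UE titles keep showing traversal stutter even when the I/O itself is asynchronous.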
 

SlimySnake

Flashless at the Golden Globes
It's really weird, I don't recall UE4 games being this bad a few years ago, same for Frostbite. What has changed in the last couple of years that an otherwise reliable engine like UE4 is producing shit ports on PC.
Honestly, shader comp and traversal stutter issues have plagued almost every UE4 game I played last gen. Days Gone had that horrendous bug in the second world that caused the framerate to crash into single digits at times. It also had traversal stutters. Jedi: Fallen Order shipped in a rough state too, but was patched very quickly; the stutters still remain, though. FF7 Remake had those hilarious texture loading issues. DoorGate, IIRC.

That said, Dead Island 2 just came out and is as polished as you can get. Days Gone on PC was a fantastic port, so maybe it's just down to games shipping in a poor state.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
The most interesting section in the video was the comparison between Redfall, Jedi Survivor and Cyberpunk 2077 with regards to CPU utilization. It's clear that UE4 games don't use the CPU properly; performance doesn't scale with additional cores. In fact, on top-of-the-line Intel CPUs (12900K, 13900K) you can even get worse performance compared to a system with 6 regular cores. Shocking.
 

Buggy Loop

Gold Member
For those keeping track:

1) Gotham Knights
2) Callisto Protocol
3) Dead Space (Frostbite)
4) Hogwarts
5) Star Wars
6) Redfall

Guess what all these games have in common?

UE4.

All of a sudden, I'm not too thrilled about UE5. The traversal stutter is present in Fortnite, and the Matrix demo was CPU bound as well, just like all these games.

I fear what CD Projekt Red will do with UE5... Very sad, because their internal engine in Cyberpunk 2077 with path tracing is amazing. Performance is solid, there's no shader compilation at start, no stutters (or very little), and it looks insane. UE5 will be a regression, except for Nanite.
 

StreetsofBeige

Gold Member
Too many developers are using Unreal Engine without knowing what they're doing. Star Wars, Redfall, etc. The Coalition are the only ones I can think of outside of Epic that have mastered it.
It's really weird, I don't recall UE4 games being this bad a few years ago, same for Frostbite. What has changed in the last couple of years that an otherwise reliable engine like UE4 is producing shit ports on PC.
I don't know anything about UE, but I thought UE is one of those mainstream tools everyone kind of knows how to use.

Sort of like during the 360/PS3 era it seemed tons of companies used UE for games and it worked fine enough.
 

Mister Wolf

Gold Member
I don't know anything about UE, but I thought UE is one of those mainstream tools everyone kind of knows how to use.

Sort of like during the 360/PS3 era it seemed tons of companies used UE for games and it worked fine enough.

Most of these developers aren't even smart enough to include a shader precompilation step before playing, be it optional or mandatory. It's the same in every industry, including my own: just because you have the title doesn't mean you know how to do the job properly.
 

Fafalada

Fafracer forever
DF said many times that the most comparable CPU to the Zen 2 in the consoles is the 3600.
The 4700S is literally the PS5 CPU (minus a working GPU); it doesn't get any closer than that. And the kind of design modifications shown there (like lower-latency/faster SIMD ops) are fairly typical of a console CPU: optimise for console-specific use cases.

That said, it doesn't mean the Series CPUs are comparable/the same. DF's observations are based on external factors (clock speed, number of cores, cache sizes, etc.), not the internal modifications we see in that chart. And we already know the two console vendors made different changes at a similar level to their respective GPUs.
 

SlimySnake

Flashless at the Golden Globes
I fear what CD Projekt Red will do with UE5... Very sad, because their internal engine in Cyberpunk 2077 with path tracing is amazing. Performance is solid, there's no shader compilation at start, no stutters (or very little), and it looks insane. UE5 will be a regression, except for Nanite.

Look at how poorly Star Wars runs with RT on: it drops below 900p in the 30 fps mode and to 600p in the performance modes on 10-12 TFLOPS machines. A 63 TFLOPS GPU is what you need to run path tracing at native 4K at 15 fps, with DLSS needed just to get it running above 30 fps. It's just not possible for these current-gen consoles. I think I told you that my 3080 was only running it at 40 fps using DLSS Balanced, i.e. 860p internal resolution.

Even Psycho RT was around 55 fps on my card using DLSS Performance at 4K, so 1080p internal. And this card is 2x more powerful than the PS5 in ray tracing. I don't think these consoles, with shit RT performance, lack of DLSS and 10 TFLOPS of raw GPU power, can handle ray tracing at anything above 1440p 30 fps.

Even Lumen is targeting 1080p 30 fps on consoles in its hardware-accelerated mode that uses ray tracing; software Lumen is 1440p 30 fps. Sadly, consoles dictate what devs choose.

P.S. Witcher 3 performance is absolutely trash in its RT mode; it just crashes as soon as I go into any town, city or settlement. There is a CPU-bound game if I ever saw one. I think regardless of the engine, devs need to wrap their heads around multithreading. They have 16 fucking threads on consoles now running at 3.5 GHz; go nuts.
 

SABRE220

Member
It's really weird, I don't recall UE4 games being this bad a few years ago, same for Frostbite. What has changed in the last couple of years that an otherwise reliable engine like UE4 is producing shit ports on PC.
Hell, even Frostbite was amazing until BF5. It was legit industry-leading tech doing stuff no one replicated, and then next gen you get the new Battlefield, etc.
 
Last edited:

poppabk

Cheeks Spread for Digital Only Future
Does it have VRR? I couldn't find an option, and it was tearing like crazy with vsync off, although maybe that was dips below the VRR range.
 

Bojji

Gold Member
The 4700S is literally the PS5 CPU (minus a working GPU); it doesn't get any closer than that. And the kind of design modifications shown there (like lower-latency/faster SIMD ops) are fairly typical of a console CPU: optimise for console-specific use cases.

That said, it doesn't mean the Series CPUs are comparable/the same. DF's observations are based on external factors (clock speed, number of cores, cache sizes, etc.), not the internal modifications we see in that chart. And we already know the two console vendors made different changes at a similar level to their respective GPUs.

What you can say with absolute certainty is that the console CPUs aren't faster than a 3600, and if this CPU performs like that, the Xbox CPU won't be much better even with the lower-level DX12 version on console.
 

LoveCake

Member
For those keeping track:

1) Gotham Knights
2) Callisto Protocol
3) Dead Space (Frostbite)
4) Hogwarts
5) Star Wars
6) Redfall

Guess what all these games have in common?

UE4.

All of a sudden, I'm not too thrilled about UE5. The traversal stutter is present in Fortnite, and the Matrix demo was CPU bound as well, just like all these games.
Hogwarts didn't have nearly as many issues as the other games mentioned, I don't think it warrants a place on this list.
 

SlimySnake

Flashless at the Golden Globes
Hogwarts didn't have nearly as many issues as the other games mentioned, I don't think it warrants a place on this list.
Dude, Hogwarts had horrific RAM usage; it was damn near unplayable on 16 GB of RAM. I have never seen RAM usage go up to 25 GB before. The stutters were extraordinary, and people were posting config file fixes to get performance to be more stable.

I eventually turned off RT and had a smooth experience after a couple of patches, but they just released a patch today that has over 500 fixes. 500.
 

Braag

Member
It's honestly baffling how awful the CPU utilization is in recent PC games. It's like we've regressed to 2010, when games were barely able to use more than 2-4 cores.
Now a lot of CPUs have like 8-24 cores and devs just refuse to use them. It's clear in Alex's benchmarks that a game like CP2077 that properly uses your CPU scales with the number of cores. I actually noticed this myself when I upgraded from an i7-10700K to an i7-13700K last year; I got a pretty significant fps boost in Cyberpunk.
I hope bigger releases of the year like Starfield don't suffer from this.
 

SlimySnake

Flashless at the Golden Globes
It's honestly baffling how awful the CPU utilization is in recent PC games. It's like we've regressed to 2010, when games were barely able to use more than 2-4 cores.
Now a lot of CPUs have like 8-24 cores and devs just refuse to use them. It's clear in Alex's benchmarks that a game like CP2077 that properly uses your CPU scales with the number of cores. I actually noticed this myself when I upgraded from an i7-10700K to an i7-13700K last year; I got a pretty significant fps boost in Cyberpunk.
I hope bigger releases of the year like Starfield don't suffer from this.
Yep. I went from an i7-8700 to an i7-11700K and it was a massive difference in Cyberpunk and nothing else...
 

HL3.exe

Member
Multi-threaded CPU optimization is still really, really hard. Remember, games are mostly based around input from the player and the game logic/simulation responding correctly. Because of this, games are deterministic, meaning they work best in logical order, mostly on a single game-logic thread. This main thread (combined with the render thread) sends out jobs to the other threads, and those jobs are mostly focused on aesthetic elements that have less dependency on the main game-logic simulation (think texture calls, particle calculations, non-gameplay-related physics, etc.).

This is what makes multicore programming, specifically for games, still really difficult to this day. And I bet the fact that they use UE4, which is not an in-house engine, makes low-level optimization even trickier, because they're dependent on the experience and documentation from Epic.
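A minimal sketch of the pattern described above, assuming a generic C++ frame loop rather than any engine's actual job system: the order-dependent game-logic tick stays serial on one thread, while independent "aesthetic" batches are fanned out to workers each frame and joined before their results are needed.

```cpp
// Hypothetical frame loop: serial game logic on the main thread,
// order-independent "aesthetic" work fanned out to worker threads.
#include <cstdio>
#include <future>
#include <vector>

struct World { int tick = 0; };

// Game logic is order-dependent: tick N+1 needs the results of tick N,
// so it stays on one thread.
void SimulateGameLogic(World& w) { ++w.tick; }

// Particle batches don't feed back into gameplay, so each batch can be
// updated independently on any core.
void UpdateParticleBatch(int batch, float dt) {
    volatile float x = 0.0f;
    for (int i = 0; i < 100000; ++i) x = x + dt * static_cast<float>(i);  // stand-in work
    (void)batch;
}

int main() {
    World world;
    const float dt = 1.0f / 60.0f;

    for (int frame = 0; frame < 3; ++frame) {
        SimulateGameLogic(world);  // main thread, strictly serial

        // Fan out independent batches, then join before the renderer consumes them.
        std::vector<std::future<void>> jobs;
        for (int batch = 0; batch < 8; ++batch)
            jobs.push_back(std::async(std::launch::async, UpdateParticleBatch, batch, dt));
        for (auto& j : jobs) j.get();

        std::printf("frame %d simulated (tick %d)\n", frame, world.tick);
    }
}
```

The hard part, and the reason more cores don't help automatically, is the join: anything the next gameplay tick reads has to be finished before that tick runs, so the main thread tends to remain the bottleneck.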
 
Last edited:

Spyxos

Gold Member
For those keeping track:

1) Gotham Knights
2) Callisto Protocol
3) Dead Space (Frostbite)
4) Hogwarts
5) Star Wars
6) Redfall

Guess what all these games have in common?

UE4.

All of a sudden, I'm not too thrilled about UE5. The traversal stutter is present in Fortnite, and the Matrix demo was CPU bound as well, just like all these games.
Yes, what happened? Wasn't Unreal Engine 4 once good? Why can't anyone handle it all of a sudden?
 

winjer

Member
Yes, what happened? Wasn't Unreal Engine 4 once good? Why can't anyone handle it all of a sudden?

UE4 always had these issues. The traversal stuttering has been around for many years, but with games pushing more detail, what was once a hiccup is now an earthquake.
The issue with shader compilation was lessened by DX11's high-level API. With DX12, performance improved, but it requires devs to gather PSOs and issue the right commands to do a shader compilation pass when the game starts.
The issue is that only a few devs know about and care to do this on PC.
And CPU thread usage was also a problem, even more so with DX11. But once again, with games now pushing more detail, what was once a small problem is now a big one.
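For anyone wondering what "gather PSOs and compile them at startup" looks like in practice, here is a rough, engine-agnostic sketch. The struct and the CompilePipeline() stand-in are hypothetical; in a real renderer that call would be something like D3D12's CreateGraphicsPipelineState or Vulkan's vkCreateGraphicsPipelines, and UE4 ships its own PSO caching system that plays this role.

```cpp
// Generic sketch of a startup precompilation pass: replay a list of
// pipeline-state descriptions recorded during playtests so the expensive
// driver-side compilation happens behind a loading screen, not mid-gameplay.
// CompilePipeline() is a stand-in for the real graphics API call.
#include <algorithm>
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

struct PipelineDesc {
    std::string vertexShader;
    std::string pixelShader;
    int renderStateHash;  // blend/depth/raster state, vertex layout, etc.
};

// Stand-in for the driver call that causes the in-game hitch when done lazily.
void CompilePipeline(const PipelineDesc& d) {
    std::this_thread::sleep_for(std::chrono::milliseconds(30));
    (void)d;
}

int main() {
    // In a real game this list is recorded during playtests and shipped with the build.
    const std::vector<PipelineDesc> cache = {
        {"static_mesh_vs", "lit_opaque_ps", 0x11},
        {"skinned_mesh_vs", "lit_masked_ps", 0x22},
        {"particle_vs", "additive_ps", 0x33},
        {"static_mesh_vs", "decal_ps", 0x44},
    };

    // Split the cache across worker threads while the loading screen is up.
    const unsigned workers = std::max(2u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            for (std::size_t i = w; i < cache.size(); i += workers)
                CompilePipeline(cache[i]);
        });
    }
    for (auto& t : pool) t.join();

    std::printf("precompiled %zu pipelines before gameplay\n", cache.size());
}
```

The catch, and presumably why so few ports bother, is that the list has to be recorded by actually playing through the content with every material and effect, then shipped and kept up to date with each patch.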
 
Last edited:

Spyxos

Gold Member
UE4 always had these issues. The traversal stuttering has been around for many years, but with games pushing more detail, what was once a hiccup is now an earthquake.
The issue with shader compilation was lessened by DX11's high-level API. With DX12, performance improved, but it requires devs to gather PSOs and issue the right commands to do a shader compilation pass when the game starts.
The issue is that only a few devs know about and care to do this on PC.
And CPU thread usage was also a problem, even more so with DX11. But once again, with games now pushing more detail, what was once a small problem is now a big one.
Is UE4 still being developed at all, now that the first UE5 games are coming? It seems as if Epic is not interested in the bad condition of the engine.
 

StreetsofBeige

Gold Member
Multi-threaded CPU optimization is still really, really hard. Remember, games are mostly based around input from the player and the game logic/simulation responding correctly. Because of this, games are deterministic, meaning they work best in logical order, mostly on a single game-logic thread. This main thread (combined with the render thread) sends out jobs to the other threads, and those jobs are mostly focused on aesthetic elements that have less dependency on the main game-logic simulation (think texture calls, particle calculations, non-gameplay-related physics, etc.).

This is what makes multicore programming, specifically for games, still really difficult to this day. And I bet the fact that they use UE4, which is not an in-house engine, makes low-level optimization even trickier, because they're dependent on the experience and documentation from Epic.
I don't know anything about CPUs and threads, but if it's hard for gaming, what kind of software would get an automatic kind of boost from adding more threads?

If I do giant spreadsheets all day, would that kind of program be a good example of getting boosts without the programmers reworking stuff?
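A hedged illustration of the distinction behind that question (generic C++, not tied to any particular spreadsheet or game): work made of independent pieces, like summing separate rows, scales with cores almost for free, while work where each step depends on the previous one, like a running balance or a game tick, does not.

```cpp
// Why some workloads scale with cores "for free" and others don't
// (generic C++, not tied to any spreadsheet product).
#include <cstddef>
#include <cstdio>
#include <future>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> rows(1000000, 1.5);

    // Independent rows: each chunk can be summed on its own core, so throughput
    // grows with core count without the programmer reworking much.
    const std::size_t chunks = 8;
    const std::size_t step = rows.size() / chunks;
    std::vector<std::future<double>> parts;
    for (std::size_t c = 0; c < chunks; ++c) {
        parts.push_back(std::async(std::launch::async, [&rows, c, step] {
            return std::accumulate(rows.begin() + c * step,
                                   rows.begin() + (c + 1) * step, 0.0);
        }));
    }
    double total = 0.0;
    for (auto& p : parts) total += p.get();

    // Dependent chain: each value needs the previous one (like a running balance,
    // or a game tick needing the last tick), so extra cores don't help here.
    double balance = 0.0;
    for (double r : rows) balance = balance * 1.0001 + r;

    std::printf("parallel sum: %.1f, serial chain: %.1f\n", total, balance);
}
```

That is broadly why batch-style workloads (big recalculations, video encoding, offline rendering) soak up extra cores easily, while a game's main simulation thread usually can't.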
 