
Lords of the Fallen quality and perf modes: 1440/30 and 1080/60

octiny

Banned
The 4090... a $1600 GPU. Runs Remnant (another UE5 game) at 2160p native at 45fps. And that GPU is paired with hardware that is well over $2400 in total on the test rig.

At max settings.

Console versions use medium settings (at best) in unlocked performance mode @ 720p native resolution (upscaled terribly to 1440p) to maintain a 30-60 fps average, where it averages out to about 45 fps.

With that said, you're looking at over a 9x increase in pixels and much higher settings, with the same average FPS, on a $2800 4090 PC vs a $500 (5.6x cheaper) or $400 (7x cheaper) console.
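Quick math for anyone who wants to check it (a rough sketch; resolutions are the native ones quoted above):

console_px = 1280 * 720        # 720p native in console performance mode
pc_px = 3840 * 2160            # 2160p native on the 4090 rig
print(pc_px / console_px)      # 9.0 -> the "9x increase in pixels"
print(2800 / 500, 2800 / 400)  # 5.6, 7.0 -> the cost ratios vs each console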

There are plenty of better comparisons you could have made to make consoles look better, so I'm not sure why you chose this one specifically. Most likely because this game will end up similar, I'm guessing?
 

Akuji

Member
My Epson projector is a 1080p panel with e-shift to 4K. 1080p native ain't so bad for me. Need to save up ASAP for the new one. Still a 1080p panel, but better shift and 120Hz. Gonna be great with the Pro consoles. As for the Pro consoles running this at 4K60: no way. That would mean 4x the performance over the 1080p/60 mode on the GPU side. We're not getting a 4x increase. Like the PS4 Pro, I expect a 2.x increase.
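The 4x figure is just raw pixel count (a crude sketch; real frame cost doesn't scale perfectly linearly with resolution):

print((3840 * 2160) / (1920 * 1080))  # 4.0 -> 4K is exactly 4x the pixels of 1080p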
 
I’m starting to think 4K was a mistake.
It was. At least, how soon it happened, I feel, was a mistake. 4K put a halt to a lot of things that were supposed to progress in video games, because resources had to be allocated to making sure games looked and ran well on displays with a much, much higher pixel count than 1080p. Ray tracing only made the situation worse, though obviously it will be good to have in the long run. Combining both, and then expecting these games to run at 4K up to 60fps with ray tracing, without any issues, and to look comparable to movie CGI on machines that only cost $500, is very, very wishful thinking.

That's why the best thing to come from this generation (outside of SSDs) has been the two modes: fidelity mode and performance mode. The problem is that lately devs keep trying to carry part of that fidelity over into performance mode, so we keep ending up with very inconsistent performance in something that is literally called performance mode. But most people like myself who play on performance mode value 60fps over everything, not fidelity, so in theory these inconsistencies shouldn't keep happening.

It should have been a gradual leap, and 1440p should have been one of the stepping stones on that incline. The rumor about Sony putting '8K capable' on the PS5 Pro box, and the eventual marketing push for 8K displays, is only going to make this entire problem worse.
 

Mr.Phoenix

Member
At max settings.

Console versions use medium settings (at best) in unlocked performance mode @ 720p native resolution (upscaled terribly to 1440p) to maintain a 30-60 fps average, where it averages out to about 45 fps.

With that said, you're looking at over a 9x increase in pixels and much higher settings, with the same average FPS, on a $2800 4090 PC vs a $500 (5.6x cheaper) or $400 (7x cheaper) console.

There are plenty of better comparisons you could have made to make consoles look better, so I'm not sure why you chose this one specifically. Most likely because this game will end up similar, I'm guessing?
Sheesh, clearly you are missing the point.

And that was made abundantly clear when you suggested that I was trying to make consoles look better. This is not about making anything look better. What I said applies to consoles and PC alike and has been going on since games existed.

I don't know what you are insecure about or have issues with, but please, leave me outta it. Don't try to turn what I said into some sort of versus argument.

My Epson projector is a 1080p panel with e-shift to 4K. 1080p native ain't so bad for me. Need to save up ASAP for the new one. Still a 1080p panel, but better shift and 120Hz. Gonna be great with the Pro consoles. As for the Pro consoles running this at 4K60: no way. That would mean 4x the performance over the 1080p/60 mode on the GPU side. We're not getting a 4x increase. Like the PS4 Pro, I expect a 2.x increase.
I am on a lens-shift projector too. I do think this will be the last projector I own, though. The fact that right now we can get a 98-inch native 2160p 120Hz TV for under $6,000... is all the writing on the wall I need.

I mean, between what I spent on my projector and 120-inch screen, it would have been cheaper to get the TV.
 

Gaiff

SBI’s Resident Gaslighter
At max settings.

Console versions use medium settings (at best) in unlocked performance mode @ 720p native resolution (upscaled terribly to 1440p) to maintain a 30-60 fps average, where it averages out to about 45 fps.

With that said, you're looking at over a 9x increase in pixels and much higher settings, with the same average FPS, on a $2800 4090 PC vs a $500 (5.6x cheaper) or $400 (7x cheaper) console.

There are plenty of better comparisons you could have made to make consoles look better, so I'm not sure why you chose this one specifically. Most likely because this game will end up similar, I'm guessing?
But that's his point, isn't it? I don't think he's trying to make the console look better. He argues that since even the monstrous 4090 doesn't come close to running this game at 4K max settings, why are people surprised that consoles sporting a GPU with a quarter of its performance are dropping to 1080p to maintain 60fps in a UE5 game?
 

Kenneth Haight

Gold Member
I'm a PC gamer, I play everything at native 4K. Heck, on some games I'll supersample to 8K & use DLSS.

Enjoy your console experience.
 

octiny

Banned
Sheesh, clearly you are missing the point.

And that was made abundantly clear when you suggested that I was trying to make consoles look better. This is not about making anything look better. What I said applies to consoles and PC alike and has been going on since games existed.

I don't know what you are insecure about or have issues with, but please, leave me outta it. Don't try to turn what I said into some sort of versus argument.

Ah, I see 😏

So why did you bring up the 4090 metrics out of nowhere, then use them as a comparison to consoles in the context of the PS5 being outdated by 27-28 (per the person you quoted), on top of blurting out how much it would cost in comparison?

I simply did the math for you.

You are the one who brought PC into the equation when responding to the other poster. I'm just pointing out your flawed logic.

You're also factually incorrect in saying the equivalent PS5 GPU (6600 XT) would run Remnant 2 at 18 fps. I let you go on that one in my first post, but I'll go ahead and correct you there as well. Mind you, this is @ ultra settings @ native 1080p. For context, the PS5-equivalent mode would be quality mode, which is 1080p native (upscaled to 1440p) @ 30 fps w/ lower settings.



I hate using GameGPU, but they are close enough.

Link

But that's his point, isn't it? I don't think he's trying to make the console look better. He argues that since even the monstrous 4090 doesn't come close to running this game at 4K max settings, why are people surprised that consoles sporting a GPU with a quarter of its performance are dropping to 1080p to maintain 60fps in a UE5 game?

Yeah, I don't see it as that. Otherwise the poster wouldn't have made the comparison to the PS5 GPU equivalent saying it ran at just 18fps (surprise, it doesn't).

Regardless, performance mode is native 720p fluctuating between 30-60fps, while quality mode (native 1080p) is 30fps for Remnant 2. The PS5 version performs 50%-75% worse than the PS5-equivalent GPU (around a 6600 XT to a 6700 non-XT) @ max settings. Although, oddly enough, the poster later confirmed he was talking about the 6700 XT for that "18 fps" statement; the PS5 performs 100% slower in that case, and 100%-175% slower if we lower everything to PS5-equivalent medium settings (2 tiers down from maxed).
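To be explicit about how I'm using those percentages (illustrative numbers only, not the actual benchmark figures):

def pct_slower(pc_fps, ps5_fps):
    # "100% slower" here means the PC GPU delivers 2x the frame rate
    return (pc_fps / ps5_fps - 1) * 100

print(pct_slower(60, 30))  # 100.0 -> the "100% slower" case
print(pct_slower(60, 40))  # 50.0  -> the low end of "50%-75% worse"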

It was easy to see through the innuendo. It's not the first time I've seen these types of posts from said poster; I just decided to correct the statements being made this time. It is what it is, though.

Also, I'd like to add that the 4090 certainly can run this game at 4K native @ maxed settings; it just depends on whether the user wants 40-50 fps native vs 70-80 with DLSS quality. Patches have also given this game a nice boost on PC across all GPU tiers. Not a game I personally want to play, though, so I won't bother benchmarking it myself to see how much it improved on the native 40-50fps at launch.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
The question is what do we get out of a PS5 Pro.

The PS4 Pro got us from like 1080p to 1800p or 4K CBR, at 30fps. You could see that difference, even on a 1080p TV, where you got very high IQ from supersampling. What will the PS5 Pro do, 1080p/60 -> 1440p/60? Is that worth $600 or $700 or whatever they are going to charge? How noticeable is that going to be, especially when the algorithms are better today than they were in 2017?
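For scale, the pixel counts behind those jumps (rough math; assuming 1800p means 3200x1800 and that checkerboarding renders about half of 4K's pixels):

base = 1920 * 1080             # 1080p
print(3200 * 1800 / base)      # ~2.78x -> the PS4 Pro-era 1800p jump
print(3840 * 2160 / 2 / base)  # ~2.0x  -> 4K CBR, roughly half of full 4K
print(2560 * 1440 / base)      # ~1.78x -> a 1080p/60 -> 1440p/60 jump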

I know it's too soon to tell, but I still remain skeptical of what this is going to do outside of bullet points and Digital Foundry write-ups, especially when the PS5 Pro is still not going to compare to an advanced GPU (which, again, won't be doing this natively either).

Yes, it's worth it for $500.
 

Mr.Phoenix

Member
Ah, I see 😏

So why did you bring up the 4090 metrics out of nowhere, then use them as a comparison to consoles in the context of the PS5 being outdated by 27-28 (per the person you quoted), on top of blurting out how much it would cost in comparison?

I simply did the math for you.

You are the one who brought PC into the equation when responding to the other poster. I'm just pointing out your flawed logic.

You're also factually incorrect in saying the equivalent PS5 GPU (6600 XT) would run Remnant 2 at 18 fps. I let you go on that one in my first post, but I'll go ahead and correct you there as well. Mind you, this is @ ultra settings @ native 1080p.



I hate using GameGPU, but they are close enough.

Link



Yeah, I don't see it as that. Otherwise the poster wouldn't have made the comparison to the PS5 GPU equivalent saying it ran at 18fps (it doesn't).

It is what it is though.
Since you want to be anal about shit... I will indulge you. Just this once, though; if you still don't get it, that's fine.

I used the 4090 simply because it represents the pinnacle of gaming as we have it today, and I showed what that GPU could manage running another UE5 game at the highest mainstream resolution and highest settings. Basically, I was using that as the baseline for what to expect before we start throwing in stuff like DLSS, FSR, TSR, etc.

The GPU I used as an example PS5 equivalent was actually the 6700 XT (a 40CU RDNA2 PC GPU), which, as far as wafers go, is the closest relative to a PS5 GPU there is. And that 18fps was that GPU (6700 XT) running the same Remnant game at the same settings the 4090 ran it: native 2160p and ultra settings. And mind you, the 6700 XT is even more powerful than a PS5 GPU.

But I wonder, why would you see me mention a 4090 running a game at 2160p, then assume that when I mention whatever I call the PS5-equivalent PC part, my comparison would shift to 1080p? You seem more interested in being right than in actually understanding what is being said.

The point I was making was clear: this engine... UE5, which represents what modern gaming is, will bring the best hardware gaming has to offer to its knees. So what do people really expect from a piece of hardware that is orders of magnitude weaker?

And to further push that point, I talked about how this is something that always happens but for whatever reason seems to be ignored: hardware gets better, be it PC or console hardware, but so do the demands of whatever new engine or feature set the industry is clamoring over. Last gen it was resolution and physically-based rendering; this gen it's RT. Every time we get just enough power to do these games justice, some new graphics tech pops up, bringing everything back down to sub-60fps gaming.

Rinse and repeat.

See? Has nothing to do with consoles...
 

Akuji

Member
I am on a lens-shift projector too. I do think this will be the last projector I own, though. The fact that right now we can get a 98-inch native 2160p 120Hz TV for under $6,000... is all the writing on the wall I need.

I mean, between what I spent on my projector and 120-inch screen, it would have been cheaper to get the TV.
Not an option for me. My center and my subs are behind the screen, so I can't put a solid display in front of them (the subs might work, but the center definitely not).

It's a home cinema first, so while picture quality is still somewhat important, audio is x1000 more important to me. When I built the room and tested it, I learned that I'd rather have no picture at all than bad sound. Good sound makes it so much more immersive. So I'm not gonna switch from my DIY center speaker to a TV speaker as a center.
 

Mr.Phoenix

Member
Not an option for me. My center and my subs are behind the screen, so I can't put a solid display in front of them (the subs might work, but the center definitely not).

It's a home cinema first, so while picture quality is still somewhat important, audio is x1000 more important to me. When I built the room and tested it, I learned that I'd rather have no picture at all than bad sound. Good sound makes it so much more immersive. So I'm not gonna switch from my DIY center speaker to a TV speaker as a center.
Haha, we are in totally different worlds. My setup is in a somewhat light-controlled living room, so I can't be too heavy on the gear. Using Sony A9 speakers with a sub. I have never been an audiophile or even a videophile, so I have been very happy with my setup. For me, a projector was about screen size, and it had to at least have some sort of surround sound.

As it stands, I am watching TVs catch up on the screen-size front, so I can't really justify buying a projector again. Especially considering my first projector and screen over 14 years ago were an Optoma and a 92-inch grey screen.
 

hlm666

Member
Well, they seem to have shown all the previews on PC, like Remnant 2, so we could all read between the lines. We have that Immortals of Aveum game coming out this month though, so that will be the first game using Nanite/Lumen/VSM (if we ignore Desordre); previews for that seemed to be on PC again.

On a side note, seeing as we are discussing TSR, I don't think it's as good as people think. Here are some comparisons in Desordre, which recently updated to UE5.2 (they also added RTXDI and SER in the same update).



 

Dream-Knife

Banned
The GPU I used as an example PS5 equivalent was actually the 6700 XT (a 40CU RDNA2 PC GPU), which, as far as wafers go, is the closest relative to a PS5 GPU there is. And that 18fps was that GPU (6700 XT) running the same Remnant game at the same settings the 4090 ran it: native 2160p and ultra settings. And mind you, the 6700 XT is even more powerful than a PS5 GPU.
If you go by CUs, the 6700 non-XT is what you're supposed to compare to.

In reality, the 6600 XT is the same TF on paper. Then again, as Cerny said, fewer CUs and higher clocks are better. So the 6600 XT, with fewer CUs and a higher clock, should outperform the slower and wider PS5.
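For anyone who wants to sanity-check the paper TF, the back-of-the-envelope formula (clock figures are the public specs; real sustained clocks vary):

def tflops(cus, clock_ghz):
    # 64 shaders per CU x 2 FP32 ops per clock (FMA)
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.23))   # ~10.3 -> PS5 (36 CUs @ up to 2.23 GHz)
print(tflops(32, 2.589))  # ~10.6 -> 6600 XT (32 CUs @ 2.589 GHz boost)
print(tflops(36, 2.495))  # ~11.5 -> 6700 non-XT (36 CUs @ 2.495 GHz boost)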
 

Ceadeus

Member
Lol, consoles underdelivered this gen. We were introduced with insane tech videos showing statues with a trillion zillion polygons and the end of loading times, with 8K 120Hz written on the box.

3 years later we are still cross-gen and already talking about the Pro version.
 

sinnergy

Member
Lol, consoles underdelivered this gen. We were introduced with insane tech videos showing statues with a trillion zillion polygons and the end of loading times, with 8K 120Hz written on the box.

3 years later we are still cross-gen and already talking about the Pro version.
Yup, this... it's amazing and sad at the same time.
 
What's going on? Are devs trying to use an engine they don't fully understand or can't optimise for, just for the sake of using something new and shiny?

Or does UE just have that much shit in it that nothing can run it well?
 
Rumor? It says 8K on the regular PS5 box.

It says 8K HDR on the Series X box too; they don't have that 8K logo, but the box also advertises it. I don't know why some people think that'll be the focus of the Pro. I'm sure the logo will be on that box as well, but it surely won't be the focus.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
What's going on? Are devs trying to use an engine they don't fully understand or can't optimise for, just for the sake of using something new and shiny?

Or does UE just have that much shit in it that nothing can run it well?
Lumen is real-time ray tracing for reflections and global illumination.
Are you assuming that shit's free?

If devs decide to bake the lighting instead of using real-time GI, these games will run much, much better at higher resolutions, but they certainly won't look as good and will take longer to author, due to needing to redo lightmaps every time you change something.


P.S. Reading too much into internal resolution is basically pointless these days, as reconstruction techniques are really, really good.
A good portion of the time, if someone doesn't tell you game X runs at sub-1440p, you wouldn't actually be able to tell.
 
The UE5 upscaler isn't anywhere close to DLSS in motion, so it's pointless to argue that PC gamers play at sub-4K too. Lewis Hamilton drives a car just like I do, but the end result is quite different. Not saying DLSS is perfect, as it still has very obvious issues in most games, but it's definitely the least awful upscaler and better than native a lot of the time, since TAA is completely terrible from an IQ point of view.

The console versions of Remnant 2 have tons of artifacting when you move the camera. The DF 'looks like 1440p' verdict was when the camera was still, which, I think most people would agree, is a hard way to play the game. Most comparisons of the different upscalers are done from still images and not action shots, so they really flatter the lesser methods.

I'm guessing this is another sub-1080p game upscaled to 1080p, as it has all the bells and whistles of UE5. The visual breakup from sub-1080p upscaling would be fine if we were still on 32-inch 1080p screens, but on a huge 4K OLED it's very noticeable.

If you disagree, my advice is to stay ignorant and not go looking for the visual issues, as it looks like they are here to stay.

I went back to Spider-Man on PS5 after playing the PC version, as all of the upscalers on PC have visual issues, and I remembered thinking that the PS5 version looked clean as hell... well, the reality is the PS5 version is much worse than the PC version, and now that I'm used to seeing this shit, it stands out like a sore thumb. Definitely stay in the dark if you can.
 

Bojji

Member
From what I understand, it's native 1080p reconstructed/upscaled to 2160p (in performance mode), but some people here really have me thinking it could be something much lower than 1080p (based on the other UE5 game)...

A super low resolution will look like dog shit in motion. Lumen and Nanite are nice, but maybe at the current state of optimization they are not worth using on these consoles at all...

But the problem is that Unreal Engine games don't seem to be optimized at all. UE4 is old as fuck and still runs like crap in many games, and UE5 is just as CPU-limited as UE4 (it can't even use many threads). For comparison, you have something like Metro Exodus on consoles running at 60FPS with dynamic res (1080p AT WORST) and real-time RTGI.
 

Mr.Phoenix

Member
If you go by CUs, the 6700 non-XT is what you're supposed to compare to.

In reality, the 6600 XT is the same TF on paper. Then again, as Cerny said, fewer CUs and higher clocks are better. So the 6600 XT, with fewer CUs and a higher clock, should outperform the slower and wider PS5.
You are right. The problem I kept having was that the site I use for GPU and PC review info for some reason doesn't have the 6700. But the 6600 XT is an almost spot-on match (at least on raw TF). Thanks for pointing that one out.
 

Bojji

Member
If you go by CUs, the 6700 non-XT is what you're supposed to compare to.

In reality, the 6600 XT is the same TF on paper. Then again, as Cerny said, fewer CUs and higher clocks are better. So the 6600 XT, with fewer CUs and a higher clock, should outperform the slower and wider PS5.

The 6600 XT has piss-poor memory bandwidth, so it can't be compared directly.

The 6700 is a much closer comparison (but still not perfect).
 

Portugeezer

Member
What's going on? Are devs trying to use an engine they don't fully understand or can't optimise for, just for the sake of using something new and shiny?

Or does UE just have that much shit in it that nothing can run it well?
Epic designed a 1440p30 engine for current gen, and some players want performance modes.
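Which is roughly a wash in raw pixel throughput between the two modes in the thread title (crude math, ignoring CPU cost and fixed per-frame overhead):

print(2560 * 1440 * 30)  # ~110.6M pixels/s for the 1440p/30 mode
print(1920 * 1080 * 60)  # ~124.4M pixels/s for the 1080p/60 mode, ~12% more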

Also, yes, let's be honest: some of these lesser-known developers probably won't be optimising their games so well.
 

hlm666

Member
From what I understand, it's native 1080p reconstructed/upscaled to 2160p (in performance mode), but some people here really have me thinking it could be something much lower than 1080p (based on the other UE5 game)...

A super low resolution will look like dog shit in motion. Lumen and Nanite are nice, but maybe at the current state of optimization they are not worth using on these consoles at all...

But the problem is that Unreal Engine games don't seem to be optimized at all. UE4 is old as fuck and still runs like crap in many games, and UE5 is just as CPU-limited as UE4 (it can't even use many threads). For comparison, you have something like Metro Exodus on consoles running at 60FPS with dynamic res (1080p AT WORST) and real-time RTGI.
Remnant 2 is bottoming out at ~720p at 30-40fps in heavy places on consoles in performance mode, and that's without Lumen. If LotF manages not to do worse than that with Lumen in use, it will probably be considered a win, honestly. There is a minor win in the PC space, which is that none of these UE5 games so far have suffered shader compilation stutter, but even there people are having to drop to lower resolutions than they probably expected as well.
 

winjer

Gold Member
Remnant 2 is bottoming out at ~720p at 30-40fps in heavy places on consoles in performance mode, and that's without Lumen. If LotF manages not to do worse than that with Lumen in use, it will probably be considered a win, honestly. There is a minor win in the PC space, which is that none of these UE5 games so far have suffered shader compilation stutter, but even there people are having to drop to lower resolutions than they probably expected as well.

Has Remnant 2 on consoles been updated with the new patches?
At least on PC, there have been massive improvements to performance. I have seen some of the heaviest scenes in the game double in frame rate.
 

BbMajor7th

Member
Has Remnant 2 on consoles been updated with the new patches?
At least on PC, there have been massive improvements to performance. I have seen some of the heaviest scenes in the game double in frame rate.
I had a go for the first time this weekend and it seemed reasonably smooth, though these were only the early areas.

On the subject of resolution, I'll wait and see. I didn't think Remnant II looked bad, considering the base resolution, so I'll wait and see how this one looks up on a screen. I'm still playing Dark Souls II on PS5: a very shimmery 1080/60 with low-poly models and low-res textures, but if a game is good, that stuff doesn't weigh too much on the experience. Art direction and animation fluidity are the make-or-break for me.
 

winjer

Gold Member
I had a go for the first time this weekend and it seemed reasonably smooth, though these were only the early areas.

Two of the places where I noticed huge performance drops when Remnant 2 launched were the boss fights with Abomination and Annihilation.
The first is an Aberration on N'Erud, so there is a chance you'll find him during the first few hours, depending on RNG. The second is the final boss, so that will take much longer to get to.
 

BbMajor7th

Member
Two of the places where I noticed huge performance drops when Remnant 2 launched were the boss fights with Abomination and Annihilation.
The first is an Aberration on N'Erud, so there is a chance you'll find him during the first few hours, depending on RNG. The second is the final boss, so that will take much longer to get to.
Yeah, haven't hit either of those yet.
 

Juza

Member
The performance mode runs at 60 frames per second and an upscaled 1080p resolution, which is kind of the standard in the industry.

It was the standard in the 2000s.
 

Bernardougf

Member
Not surprising, but I don't think we should be excited or happy about it. Am I wrong in thinking that most people expected higher performance earlier in the gen? 1080p/60 was kinda the claim to fame for the PS4 Pro and Xbox One X.
These are $500 boxes from 2020... I think this is about right... people should stop expecting to pay cheap and get top-notch performance, especially from third parties.
 