
Digital Foundry: PS5 and Xbox Series X run Cyberpunk 2077: Phantom Liberty at 900p resolution sometimes

NickFire

Member
And you guys still say there's no need for pro consoles.



*cries in 30fps FF XVI*
My first reaction was “yup, the guys who say we need pro versions are right after all.”

 

Celcius

°Temp. member
I’ll be extremely interested to see how long this gen lasts and whether we get Pro consoles or not. I hope we do get them again though.
 
People who "shame" those resolutions are aware that PCs use similar resolutions with DLSS or FSR 2, right?




I'm pretty sure there are a lot of users around here playing at 960p on a 1440p monitor using DLSS in quality mode without knowing it, while trying to shame the consoles.
I will for sure be playing at 1440p with either the quality or balanced preset on my 3080 to reach around and above 60fps. I'll likely turn off all RT features too, since they absolutely hammer FPS.

Consoles got off OK here, I think. Nothing to be concerned about. The average user will likely be very impressed with the image quality in performance mode.
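If anyone wants to check that 960p figure, here's a quick Python sketch using the per-axis scale factors for the upscaler presets (FSR 2's documented ratios; DLSS uses the same or very close values - the helper itself is just for illustration):

```python
# Nominal per-axis scale factors for the common upscaler presets.
PRESETS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_height(output_height: int, preset: str) -> int:
    """Internal render height for a given output height and preset."""
    return round(output_height / PRESETS[preset])

for name in PRESETS:
    print(f"1440p {name}: {internal_height(1440, name)}p")
# 1440p Quality: 960p   <- the "960p without knowing it" case
# 1440p Balanced: 847p
# 1440p Performance: 720p
# 1440p Ultra Performance: 480p
```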
 

V1LÆM

Gold Member
Not surprising. They dropped last-gen hardware, and the game is being upgraded with new gameplay systems, improved AI, and apparently enhanced visuals.

PC requirements are being increased, and the game is going to be bumped to v2.0 when Phantom Liberty comes out. It's not the same game it was at launch.
 

ryan90k

Neo Member
I find the title misleading, as 900p is never displayed; it's only a base resolution for reconstruction done by FSR 2. They have three months till release, so hopefully they can reach 60fps by then. I think FF16 has done much worse, since it drops to 720p and uses an oil-painting filter... I mean, FSR 1 to upscale.
 

ryan90k

Neo Member
Not very surprising. Phantom Liberty is making many improvements to a baseline game that already hardly ran on last-gen consoles, so it's effectively a current-gen game, and we've seen how tall an order 60fps is for these machines.

FF XVI drops to 720p with FSR1, so 900p with FSR2 doesn't seem too bad in comparison. Hopefully it resolves a somewhat clean image, because anything below a base resolution of 1440p using FSR2 suffers immensely compared to DLSS and even XeSS.
Facts, but I don't think 1440p FSR2 in quality mode looks bad; it's only when you drop to balanced or performance.
 

acm2000

Member
It's not the resolution numbers that matter but the end result, which in this case looks really good.

If high numbers turn you on, then buy a PC.
 

pasterpl

Member
720p in the new FF, 900p in the new Cyberpunk, both with unstable FPS, and people were shitting on Bethesda for not adding a performance mode to a much bigger game - Starfield. Good that I've got a decent gaming PC for proper 4K @ 60+ fps.
 
Just play it at 30 then if you want better resolution, same as it's always been. At least the devs give you the option to make up your own mind.
 

PeteBull

Member
That 8K logo on the box of my PS5 looks more hideous as time goes by.
8K was a marketing stunt; no demanding game on either XSX or PS5 can even run at a solid 4K60. Hell, we're lucky we got beauties like FH5 and Demon's Souls both running at 1440p60 (of course cuts were made vs the 4K30 modes, but still, it's all worth it).
 

01011001

Banned
8K was a marketing stunt; no demanding game on either XSX or PS5 can even run at a solid 4K60. Hell, we're lucky we got beauties like FH5 and Demon's Souls both running at 1440p60 (of course cuts were made vs the 4K30 modes, but still, it's all worth it).

The fact that the consoles can't even output 8K while claiming they can must surely count as false advertising.
 

Mr.Phoenix

Member
I find there are a lot of hypocrites in threads like these....

Why is reconstruction OK for PC, but when it comes to consoles it's a bad thing?

These base resolutions that these threads focus on are no different from what we see on PC using the very same reconstruction techniques. The only real difference is that on consoles the devs maintain a 4K output target and control what the input resolution is, so in this case that input res goes from 1440p (quality on PC) all the way down to 900p (which is still higher than ultra-performance on PC).

I just find it strange.
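To make that "devs control the input resolution" point concrete, here's a rough sketch of the kind of feedback loop a dynamic-resolution setup runs each frame. The 900p-1440p window is the one reported for this game; the controller logic itself is a generic illustration, not CDPR's actual code:

```python
TARGET_MS = 16.7           # frame budget for 60fps
MIN_H, MAX_H = 900, 1440   # FSR2 input window; the output target stays 2160p

def next_input_height(current_h: int, last_frame_ms: float) -> int:
    """Pick the next frame's render height from how the last frame performed."""
    # GPU cost scales roughly with pixel count (height squared at a fixed
    # aspect ratio), so adjust height by the square root of the load ratio.
    load = last_frame_ms / TARGET_MS
    proposed = current_h / load ** 0.5
    return int(min(MAX_H, max(MIN_H, proposed)))

print(next_input_height(1440, 25.0))  # heavy scene: drops to ~1176p
print(next_input_height(900, 10.0))   # light scene: climbs back to ~1163p
```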
 

FireFly

Member
And you guys still say there's no need for pro consoles.



*cries in 30fps FF XVI*
The real challenge is in doubling performance while still being able to meet a $499 or even $599 price point without increasing power consumption. It really depends on what happens with RDNA 4, because RDNA 3 has not delivered huge improvements in performance per transistor.
 

squidilix

Member
I don't know why people are complaining about resolution, given that DLSS Quality / Performance on a 1440p screen on PC renders at 960p and 720p respectively...

Nvidia's marketing has really brainwashed some people...
 

winjer

Gold Member
Too bad that CDPR is still using FSR 2.1.
FSR 2.2 is not a huge step, but it brought several improvements to image quality in motion:

  • Changes to improve "High Velocity Ghosting" situations.
  • Changes to Luminance computation with pre-exposure application.
  • Small motion vectors ignored in previous depth estimation.
  • Changes to depth logic to improve disocclusion detection and avoid self-disocclusions.
  • Dilated reactive mask logic updated to use temporal motion vector divergence to kill locks.
  • New lock luminance resource.
  • Accumulation overhauled to use temporal reactivity.
  • Changed how intermediate signals are stored and tonemapped.
  • Luminance instability logic improved.
  • Tonemapping no longer applied during RCAS to retain more dynamic range.
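For context on that last bullet: temporal upscalers commonly do their filtering on tonemapped colour and invert the tonemap afterwards, so where that mapping is applied affects how much dynamic range survives. A toy Python sketch of the well-known reversible Reinhard/Karis-style operator, purely to illustrate the idea (FSR's actual code is HLSL and considerably more involved):

```python
def luma(rgb):
    """Rec. 709 luminance of a linear RGB triple."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def tonemap(rgb):
    """Compress HDR colour into [0, 1) so bright pixels don't dominate filtering."""
    k = 1.0 / (1.0 + luma(rgb))
    return tuple(c * k for c in rgb)

def inverse_tonemap(rgb):
    """Undo the compression after filtering to recover the HDR range."""
    k = 1.0 / (1.0 - luma(rgb))
    return tuple(c * k for c in rgb)

hdr = (4.0, 2.0, 1.0)  # an HDR pixel well above 1.0
roundtrip = inverse_tonemap(tonemap(hdr))
assert all(abs(a - b) < 1e-6 for a, b in zip(hdr, roundtrip))
```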
 

BootsLoader

Banned
Hmmm, I think CDPR is a PC-oriented company. They make their games for PC first and then port them to console. That's why we have such problems. Maybe not, who knows. But I can see consoles having a hard time keeping up with next-gen games.

So 30 FPS is the norm again.
 

RIPN2022

Member
This game honestly should have just been a PC-only game; it would have been much better that way, even deeper, and maybe the true successor to the OG Deus Ex. I think the console release benefits them financially, but it takes a lot of effort that could have been spent on the gameplay and RPG systems.
The PS5/Series X is up there with a lot of modern PCs. It's poor optimization by CDPR.
 
I find there are a lot of hypocrites in threads like these....

Why is reconstruction OK for PC, but when it comes to consoles it's a bad thing?

These base resolutions that these threads focus on are no different from what we see on PC using the very same reconstruction techniques. The only real difference is that on consoles the devs maintain a 4K output target and control what the input resolution is, so in this case that input res goes from 1440p (quality on PC) all the way down to 900p (which is still higher than ultra-performance on PC).

I just find it strange.
Because most of them aren't using the same reconstruction techniques - Nvidia has around 80% of the market, so most people are using DLSS.

DLSS at low resolution is pretty good outside of some select games where it is broken - some games still have bad blur or ghosting even after replacing the .dll, so for those I wouldn't use DLSS. Obviously anything with DLSS 1 is not worth using, as it was really bad.

FSR (1 or 2) at low resolution is dire and I'd rather not bother - if I have to play at a low resolution and a game has DLSS, I use DLSS; if a game only has FSR, I use native 100% of the time. If you desperately need the performance it is fine, but there are drawbacks that are just magnified at low resolution.

DLSS and FSR at low resolution are absolutely not comparable in motion unless you have very poor vision or are very forgiving of artifacts. FSR is fine to jump the gap to 4K from a decent resolution, but taking 900p to 4K is just outside what it is capable of.

Look at any of my moaning on here and you will see that I am very picky about image quality, but DLSS for me is more than acceptable even from sub-1080p. It tends to be a bit softer but is very stable, without the shimmering and aliasing you get even at 4K native.

To check I'm not talking shit, I just tried Cyberpunk 2077 up to 4K using FSR 2.1 performance and DLSS performance (so 1080p internal).
- DLSS gives an image worse than the native presentation in some ways but actually better in others, as it is very stable, particularly in fine detail and stuff in the distance - there is no shimmering, artifacting or obvious aliasing.
- FSR looks much worse than native in every way, as it has so much shimmering on fine detail and so much aliasing that you'd honestly have to be blind not to notice it in motion.
- Both look good in screenshots, so if you are comparing upscaling techniques from screenshots you are fooling yourself - move the camera and watch the pixels crawl when using FSR.

If there is an amazing implementation of FSR2 in any game, tell me about it and I will try it, but in the stuff I've tried recently (RE4R and Jedi Survivor) I disable FSR, as I don't need the performance and it is worse all round compared to native, whereas I always use DLSS if available.

Maybe there is something in DLSS that I'm just not prone to noticing, but the flaws in FSR are so obvious to me that I'd just rather not bother, at least at sub-1080p resolutions.

If some magical AA solution arrives that is better than TAA (maybe Unreal's Super Resolution, which I've not tried) I'd be happy to re-evaluate my better-than-native stance, as I appreciate that games today are hamstrung by TAA. I actually think Insomniac's TAA upscaling is nice - it gives a softer image but a very pleasing one, as I'd take a bit of blur over pixels dancing all over the screen.
 

GHG

Member
Just buy a PC if you want higher frames/resolution/fidelity

I have one. But I'd also like the option to play console exclusives at 60fps and non-potato resolutions. I shouldn't have to wait for the PC version to release to achieve that.

I bought a Switch for TOTK and got so sick of it (the performance/resolution) that I sought other avenues via my PC to play and enjoy the game. I shouldn't have to do that; Nintendo should get their arse in gear.

I'd buy a Switch Pro (or 2) and a PS5 Pro day one. For those of you who want to continue living in squalor, well... you don't have to buy them.
 

RIPN2022

Member
Not surprising. They dropped last-gen hardware, and the game is being upgraded with new gameplay systems, improved AI, and apparently enhanced visuals.

PC requirements are being increased, and the game is going to be bumped to v2.0 when Phantom Liberty comes out. It's not the same game it was at launch.
It's poor optimization at the end of the day.
 

hlm666

Member
Hmmm, I think CDPR is a PC-oriented company. They make their games for PC first and then port them to console. That's why we have such problems. Maybe not, who knows. But I can see consoles having a hard time keeping up with next-gen games.

So 30 FPS is the norm again.
Well, the performance of some other recent releases shows that even console-focused devs are not getting much better results, so maybe blaming CDPR for focusing on PC isn't really the whole story. These consoles seem to be struggling when the CPU and GPU are being pushed in tandem. We keep hearing about how good consoles are because of unified memory, but we might be seeing examples of the downside of that approach. Maybe because last gen's CPUs were not strong enough, and with 30fps being the norm, CPU latency and memory contention ended up not being an issue; the change to Zen 2 and higher frame rate targets may have made it one. Zen 2 performance is noticeably affected by memory speed/latency even on desktop.
 

Gaiff

SBI’s Resident Gaslighter
I find there are a lot of hypocrites in threads like these....

Why is reconstruction OK for PC, but when it comes to consoles it's a bad thing?

These base resolutions that these threads focus on are no different from what we see on PC using the very same reconstruction techniques. The only real difference is that on consoles the devs maintain a 4K output target and control what the input resolution is, so in this case that input res goes from 1440p (quality on PC) all the way down to 900p (which is still higher than ultra-performance on PC).

I just find it strange.
The real difference is that one uses FSR1/FSR2 and the other uses DLSS. People also shit on FSR2 at lower resolutions on PC, and they absolutely, 100% dumped on DLSS1 at any resolution because it was garbage. The double standard you guys keep crying about isn't there.
 

fatmarco

Member
To be honest, the graphics do not look anywhere near improved enough to justify the decrease in performance and image quality.
 
I have one. But I'd also like the option to play console exclusives at 60fps and non-potato resolutions. I shouldn't have to wait for the PC version to release to achieve that.

I bought a Switch for TOTK and got so sick of it (the performance/resolution) that I sought other avenues via my PC to play and enjoy the game. I shouldn't have to do that; Nintendo should get their arse in gear.

I'd buy a Switch Pro (or 2) and a PS5 Pro day one. For those of you who want to continue living in squalor, well... you don't have to buy them.
This is clearly a joke post.

Games get more complex over time, and the consoles' power never increases, meaning games require more power just to run at all, let alone meet your high expectations.
 

Bernardougf

Member
The real challenge is in doubling performance while still being able to meet a $499 or even $599 price point without increasing power consumption. It really depends on what happens with RDNA 4, because RDNA 3 has not delivered huge improvements in performance per transistor.
Maybe $599... but I'm thinking north of $699... they just released a peripheral that costs the price of the console, and a $200 controller... time to embrace the high-end crowd and swing for the fences with the Pro model... the market share and goodwill are there.
 

Gaiff

SBI’s Resident Gaslighter
To be honest, the graphics do not look anywhere near improved enough to justify the decrease in performance and image quality.
I believe they have been significantly improved compared to the base consoles. The thing these days is, graphics have gotten so complex and carry so much visual information that it sometimes becomes hard to spot even obvious differences, because there is so much on screen. Back when graphics were simple, the differences were blatant. Now your eyes might just be caught on something else, missing other details.
 

Mr.Phoenix

Member
Because most of them aren't using the same reconstruction techniques - Nvidia has around 80% of the market, so most people are using DLSS.

DLSS at low resolution is pretty good outside of some select games where it is broken - some games still have bad blur or ghosting even after replacing the .dll, so for those I wouldn't use DLSS. Obviously anything with DLSS 1 is not worth using, as it was really bad.

FSR (1 or 2) at low resolution is dire and I'd rather not bother - if I have to play at a low resolution and a game has DLSS, I use DLSS; if a game only has FSR, I use native 100% of the time. If you desperately need the performance it is fine, but there are drawbacks that are just magnified at low resolution.

DLSS and FSR at low resolution are absolutely not comparable in motion unless you have very poor vision or are very forgiving of artifacts. FSR is fine to jump the gap to 4K from a decent resolution, but taking 900p to 4K is just outside what it is capable of.

Look at any of my moaning on here and you will see that I am very picky about image quality, but DLSS for me is more than acceptable even from sub-1080p. It tends to be a bit softer but is very stable, without the shimmering and aliasing you get even at 4K native.

To check I'm not talking shit, I just tried Cyberpunk 2077 up to 4K using FSR 2.1 performance and DLSS performance (so 1080p internal).
- DLSS gives an image worse than the native presentation in some ways but actually better in others, as it is very stable, particularly in fine detail and stuff in the distance - there is no shimmering, artifacting or obvious aliasing.
- FSR looks much worse than native in every way, as it has so much shimmering on fine detail and so much aliasing that you'd honestly have to be blind not to notice it in motion.
- Both look good in screenshots, so if you are comparing upscaling techniques from screenshots you are fooling yourself - move the camera and watch the pixels crawl when using FSR.

If there is an amazing implementation of FSR2 in any game, tell me about it and I will try it, but in the stuff I've tried recently (RE4R and Jedi Survivor) I disable FSR, as I don't need the performance and it is worse all round compared to native, whereas I always use DLSS if available.

Maybe there is something in DLSS that I'm just not prone to noticing, but the flaws in FSR are so obvious to me that I'd just rather not bother, at least at sub-1080p resolutions.

If some magical AA solution arrives that is better than TAA (maybe Unreal's Super Resolution, which I've not tried) I'd be happy to re-evaluate my better-than-native stance, as I appreciate that games today are hamstrung by TAA. I actually think Insomniac's TAA upscaling is nice - it gives a softer image but a very pleasing one, as I'd take a bit of blur over pixels dancing all over the screen.
Ok cool. Can you do me a favor?

Scrub through that DF video, find a screenshot from it, and show me what these very obvious, visible, immersion-breaking drawbacks of FSR2's implementation are on the current-gen consoles in this game. As seen in that video. If not, what are we even talking about?

Like we work in crazy extremes here on GAF, focusing more on the tech than on its implementation and results. I can see the flaws of FSR2 in games like SW: Jedi Survivor... but I have looked at this video and saw none of that. People keep talking about DLSS... well, these consoles use an AMD APU, we do not get DLSS... so why even bring it up? That's like me going into every Nintendo Switch thread and talking about how the games don't have RT. And not everyone that owns a PC has an Nvidia card either.

And no, I do not believe FSR looks worse than native 900p. The game has an FSR input range of 900p to 1440p, and we are seeing that this is the preferred way for devs to use the tech now. That means it dynamically goes from just above what would be 4K FSR ultra-performance (900p internal, vs that preset's 720p) up to FSR quality (1440p internal). And hyperbole and exaggeration aside, it's not like it's just sitting at its worst possible point 90% of the time; it's probably the other way around.

I'm not making shit up here, just stating the facts, and I find it disingenuous at best, or flat-out hypocritical at worst, that people who seem to know so much about reconstruction techniques, dynamic resolution, and tech in general seem to be blind to this and just come into threads like these and say `er duh, 900p consoles!`.
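For reference, the quick math on where that 900p-1440p window sits against the 4K FSR2 presets (standard per-axis factors assumed: quality 1.5x, performance 2x, ultra-performance 3x):

```python
OUTPUT_H = 2160  # the fixed 4K output target

for internal_h in (1440, 1080, 900, 720):
    print(f"{internal_h}p -> {OUTPUT_H / internal_h:.2f}x upscale")
# 1440p -> 1.50x (quality)
# 1080p -> 2.00x (performance)
#  900p -> 2.40x (between performance and ultra-performance)
#  720p -> 3.00x (ultra-performance)
```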
 

winjer

Gold Member
Ok cool. Can you do me a favor?

Scrub through that DF video, find a screenshot from it, and show me what these very obvious, visible, immersion-breaking drawbacks of FSR2's implementation are on the current-gen consoles in this game. As seen in that video. If not, what are we even talking about?

Like we work in crazy extremes here on GAF, focusing more on the tech than on its implementation and results. I can see the flaws of FSR2 in games like SW: Jedi Survivor... but I have looked at this video and saw none of that. People keep talking about DLSS... well, these consoles use an AMD APU, we do not get DLSS... so why even bring it up? That's like me going into every Nintendo Switch thread and talking about how the games don't have RT. And not everyone that owns a PC has an Nvidia card either.

And no, I do not believe FSR looks worse than native 900p. The game has an FSR input range of 900p to 1440p, and we are seeing that this is the preferred way for devs to use the tech now. That means it dynamically goes from just above what would be 4K FSR ultra-performance (900p internal, vs that preset's 720p) up to FSR quality (1440p internal). And hyperbole and exaggeration aside, it's not like it's just sitting at its worst possible point 90% of the time; it's probably the other way around.

I'm not making shit up here, just stating the facts, and I find it disingenuous at best, or flat-out hypocritical at worst, that people who seem to know so much about reconstruction techniques, dynamic resolution, and tech in general seem to be blind to this and just come into threads like these and say `er duh, 900p consoles!`.

The implementation of FSR2 in Jedi Survivor is version 2.0, which is the worst version. Version 2.1 brought major improvements in image quality, and 2.2 improved on it further.
Some people just can't understand this and think every version is the same.
 

ryan90k

Neo Member
Because most of them aren't using the same reconstruction techniques - Nvidia has around 80% of the market, so most people are using DLSS.

DLSS at low resolution is pretty good outside of some select games where it is broken - some games still have bad blur or ghosting even after replacing the .dll, so for those I wouldn't use DLSS. Obviously anything with DLSS 1 is not worth using, as it was really bad.

FSR (1 or 2) at low resolution is dire and I'd rather not bother - if I have to play at a low resolution and a game has DLSS, I use DLSS; if a game only has FSR, I use native 100% of the time. If you desperately need the performance it is fine, but there are drawbacks that are just magnified at low resolution.

DLSS and FSR at low resolution are absolutely not comparable in motion unless you have very poor vision or are very forgiving of artifacts. FSR is fine to jump the gap to 4K from a decent resolution, but taking 900p to 4K is just outside what it is capable of.

Look at any of my moaning on here and you will see that I am very picky about image quality, but DLSS for me is more than acceptable even from sub-1080p. It tends to be a bit softer but is very stable, without the shimmering and aliasing you get even at 4K native.

To check I'm not talking shit, I just tried Cyberpunk 2077 up to 4K using FSR 2.1 performance and DLSS performance (so 1080p internal).
- DLSS gives an image worse than the native presentation in some ways but actually better in others, as it is very stable, particularly in fine detail and stuff in the distance - there is no shimmering, artifacting or obvious aliasing.
- FSR looks much worse than native in every way, as it has so much shimmering on fine detail and so much aliasing that you'd honestly have to be blind not to notice it in motion.
- Both look good in screenshots, so if you are comparing upscaling techniques from screenshots you are fooling yourself - move the camera and watch the pixels crawl when using FSR.

If there is an amazing implementation of FSR2 in any game, tell me about it and I will try it, but in the stuff I've tried recently (RE4R and Jedi Survivor) I disable FSR, as I don't need the performance and it is worse all round compared to native, whereas I always use DLSS if available.

Maybe there is something in DLSS that I'm just not prone to noticing, but the flaws in FSR are so obvious to me that I'd just rather not bother, at least at sub-1080p resolutions.

If some magical AA solution arrives that is better than TAA (maybe Unreal's Super Resolution, which I've not tried) I'd be happy to re-evaluate my better-than-native stance, as I appreciate that games today are hamstrung by TAA. I actually think Insomniac's TAA upscaling is nice - it gives a softer image but a very pleasing one, as I'd take a bit of blur over pixels dancing all over the screen.
The question then becomes "what is the right solution for consoles?" As DLSS isn't an option, what should they use instead of FSR 2? I don't see any better alternatives, and native with TAA would also have pixel crawl if rendered at a low resolution, and it already has ghosting/trailing that can be worse than reconstruction techniques.
 
Ok cool. Can you do me a favor?

Scrub through that DF video, find a screenshot from it, and show me what these very obvious, visible, immersion-breaking drawbacks of FSR2's implementation are on the current-gen consoles in this game. As seen in that video. If not, what are we even talking about?

Like we work in crazy extremes here on GAF, focusing more on the tech than on its implementation and results. I can see the flaws of FSR2 in games like SW: Jedi Survivor... but I have looked at this video and saw none of that. People keep talking about DLSS... well, these consoles use an AMD APU, we do not get DLSS... so why even bring it up? That's like me going into every Nintendo Switch thread and talking about how the games don't have RT. And not everyone that owns a PC has an Nvidia card either.

And no, I do not believe FSR looks worse than native 900p. The game has an FSR input range of 900p to 1440p, and we are seeing that this is the preferred way for devs to use the tech now. That means it dynamically goes from just above what would be 4K FSR ultra-performance (900p internal, vs that preset's 720p) up to FSR quality (1440p internal). And hyperbole and exaggeration aside, it's not like it's just sitting at its worst possible point 90% of the time; it's probably the other way around.

I'm not making shit up here, just stating the facts, and I find it disingenuous at best, or flat-out hypocritical at worst, that people who seem to know so much about reconstruction techniques, dynamic resolution, and tech in general seem to be blind to this and just come into threads like these and say `er duh, 900p consoles!`.
I mean, that entire video looks dreadful, so if your argument is that it represents acceptable image quality then I'm not sure we will ever be on the same page. Probably because it's a video CDPR sent to DF that has been compressed again for YouTube.

If Cyberpunk looked anywhere near that bad on my screen, I wouldn't play it.
 
The question then becomes "what is the right solution for consoles?" As DLSS isn't an option, what should they use instead of FSR 2? I don't see any better alternatives, and native with TAA would also have pixel crawl if rendered at a low resolution, and it already has ghosting/trailing that can be worse than reconstruction techniques.
FSR2 is OK if that's the only option; I was just addressing the fact that people are apparently hypocrites for using DLSS on PC when it is light years ahead of FSR - the two simply aren't comparable. If it takes FSR to get 60fps on a console, then FSR is 100% the right choice.

Again, if there is a game where FSR is apparently good quality, let me know and if I have it I'll try it, but Cyberpunk 2077 uses FSR 2.1 and it looks crap in motion compared to DLSS.
 

Gaiff

SBI’s Resident Gaslighter
FSR2 is OK if that's the only option; I was just addressing the fact that people are apparently hypocrites for using DLSS on PC when it is light years ahead of FSR - the two simply aren't comparable. If it takes FSR to get 60fps on a console, then FSR is 100% the right choice.

Again, if there is a game where FSR is apparently good quality, let me know and if I have it I'll try it, but Cyberpunk 2077 uses FSR 2.1 and it looks crap in motion compared to DLSS.
FSR2 is pretty solid at 4K Quality.
 

DenchDeckard

Moderated wildly
As soon as we started getting next-gen-only games, this generation has suffered.

Xbox is managing to show its slight advantage on next-gen content again, plus it will operate in the console's VRR window... but we probably need Pro consoles ASAP for 60fps. That or go PC.

Alan Wake 2 is gonna be 30fps and that's sadge... PC anyway for ray tracing, but nm.
 