
[Digital Foundry] Star Wars Jedi Survivor PS5 Pro: Severe Image Quality Problems... But RT in Performance Mode Is Nice

twilo99

Member
Eh....not sure how reliable any of that is, but ok.



lol...wrong. It is pointed out across the board whether it is PS, Xbox or PC. The fact that you are suggesting this only occurs with "Sony hardware" is the selective bias here.

That’s why I said “Most of the time”

Of course there is talk about “bad ports” on PC, Xbox, etc. but just keep this in mind with some of the upcoming titles on Xbox and observe the reactions in case those games don’t run well on the hardware…
 

Topher

Identifies as young
That’s why I said “Most of the time”

Of course there is talk about “bad ports” on PC, Xbox, etc. but just keep this in mind with some of the upcoming titles on Xbox and observe the reactions in case those games don’t run well on the hardware…

I don't have to. I've already seen the reactions to Xbox games performing worse than their PS counterparts and it is attributed to poor optimization every single time. When games are worse on PS, same thing. I've seen PC gamers complain about shitty PC ports from devs plenty. I know because I've been one of them. So no, there is nothing to "keep in mind". We've been down this road plenty of times.
 

SlimySnake

Flashless at the Golden Globes
These devs that struggle with low res on the base PS5 know that they don't have to use PSSR on the Pro, right?

Lords of the Fallen's devs just used the engine upscaler with a 40 percent increase in internal res, Empire of the Ants looks super sharp with a 1440p internal res also using the UE5 upscaler, and so does Lies of P.

Unless your engine is light enough to hit a high internal res that resolves a good-looking, stable image, why choose a sub-1080p res with PSSR?

Would you rather have AW2 running at 1200p with FSR, or PSSR running at 860p with barely noticeable graphical gains?

Same thing for this game too. Would you rather have it running at an average of around 1400p with FSR or the current performance mode at around 1000p? The same goes for SH2.

I'm sorry, Sony is not forcing PSSR on these devs, but it is bizarre to see them opt for lower resolutions with PSSR instead of higher resolutions with FSR.
There is not enough power here to go from 860p to 1200p. Even by Cerny's own claims it's 45%, but we've seen it top out at 30-35%. 860p to 1080p would require almost 100%. 860p to 1200p would require 150% more GPU.

This isn't the PS4 Pro. You don't have that much extra power available, so devs have their hands tied. They were told PSSR is this great DLSS-like AI upscaler, but it's clearly not. At least not right now. At least not at those low resolutions.

To go from 1080p to 1440p, you will need 75% more GPU. You only have around half of that here. So 1080p would become 1296p at best. But we know from DF's own pixel counting that it wasn't 1080p and dropped below that.
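For reference, a minimal back-of-the-envelope sketch of the scaling model this post assumes, i.e. GPU cost proportional to pixel count on a 16:9 frame, so an X% uplift scales each axis by sqrt(1 + X). Illustrative only; later posts in the thread dispute the model itself:

```python
# Naive model assumed above: GPU cost scales linearly with pixel count,
# so an X% uplift scales each axis (and the "p" height) by sqrt(1 + X).
def reachable_height(base_height: int, uplift: float) -> float:
    """Output height reachable from base_height given a fractional GPU uplift."""
    return base_height * (1 + uplift) ** 0.5

for base in (860, 1080):
    for uplift in (0.35, 0.45):
        print(f"{base}p with +{uplift:.0%} GPU -> ~{reachable_height(base, uplift):.0f}p")

# A ~44% uplift on 1080p is where the 1296p figure comes from (1080 * 1.2).
```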
 

Kangx

Member from Brazile
There is not enough power here to go from 860p to 1200p. Even by Cerny's own claims it's 45%, but we've seen it top out at 30-35%. 860p to 1080p would require almost 100%. 860p to 1200p would require 150% more GPU.

This isn't the PS4 Pro. You don't have that much extra power available, so devs have their hands tied. They were told PSSR is this great DLSS-like AI upscaler, but it's clearly not. At least not right now. At least not at those low resolutions.

To go from 1080p to 1440p, you will need 75% more GPU. You only have around half of that here. So 1080p would become 1296p at best. But we know from DF's own pixel counting that it wasn't 1080p and dropped below that.
Wait, what? How is going from 860p to 1080p 100% more power, but going from 1080p to 1440p only requires 75%?

And why is 860p to 1080p 100%? Can somebody do the math on how much of a percentage going from 860p to 1200p is?

I don't get your logic though. 30-35 percent is just pure brute force. When enhanced, it differs game to game.

How do you explain Baldur's Gate 3 going from 1080p to 1440p? The Lords of the Fallen devs stated there is a 40 percent improvement in resolution. Lies of P goes from 1500p-1800p to native 4K. Empire of the Ants runs at 1440p at 60fps where the base console has a slightly higher res but runs at 30fps.
 

SlimySnake

Flashless at the Golden Globes
Wait, what? How is going from 860p to 1080p 100% more power, but going from 1080p to 1440p only requires 75%?

And why is 860p to 1080p 100%? Can somebody do the math on how much of a percentage going from 860p to 1200p is?

I don't get your logic though. 30-35 percent is just pure brute force. When enhanced, it differs game to game.

How do you explain Baldur's Gate 3 going from 1080p to 1440p? The Lords of the Fallen devs stated there is a 40 percent improvement in resolution. Lies of P goes from 1500p-1800p to native 4K. Empire of the Ants runs at 1440p at 60fps where the base console has a slightly higher res but runs at 30fps.
Resolutions are measured on two axes. 1080p is 1920*1080 pixels. That's 2.1 million pixels. 1440p is 2560*1440, or 3.68 million pixels. Basic math tells us it would require 75% more GPU to render 75% more pixels.

800p is 1200*800 = 1 million pixels. 1080p is 2.1 million pixels. 100% more pixels = 100% more GPU power. At the minimum. You might also need more VRAM bandwidth to ensure there are no bottlenecks. In the PS5 Pro's case, its 67% more powerful GPU is topping out at 45% more performance because they only upgraded the VRAM by 25% and didn't include the cache that PC cards use to perform up to spec. But typically, if the GPU is well designed with no other bottlenecks, you will need 100% more power to render 100% more pixels.
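The raw pixel counts behind those percentages, as a quick sketch; note that the "800p" above is taken as 1200x800, while a 16:9 800p frame would be about 1422x800:

```python
# Pixel counts for the resolutions compared in the post above.
res = {
    "800p (1200x800)": 1200 * 800,
    "800p (16:9)":     1422 * 800,
    "1080p":           1920 * 1080,
    "1440p":           2560 * 1440,
    "4K":              3840 * 2160,
}
for name, px in res.items():
    print(f"{name:>16}: {px / 1e6:.2f} M pixels")

print(f"1080p -> 1440p: {res['1440p'] / res['1080p'] - 1:+.0%} more pixels")            # about +78%
print(f"800p  -> 1080p: {res['1080p'] / res['800p (1200x800)'] - 1:+.0%} more pixels")  # about +116%
```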
 

Gaiff

SBI’s Resident Gaslighter
Resolutions are measured on two axes. 1080p is 1920*1080 pixels. That's 2.1 million pixels. 1440p is 2560*1440, or 3.68 million pixels. Basic math tells us it would require 75% more GPU to render 75% more pixels.

800p is 1200*800 = 1 million pixels. 1080p is 2.1 million pixels. 100% more pixels = 100% more GPU power. At the minimum. You might also need more VRAM bandwidth to ensure there are no bottlenecks. In the PS5 Pro's case, its 67% more powerful GPU is topping out at 45% more performance because they only upgraded the VRAM by 25% and didn't include the cache that PC cards use to perform up to spec. But typically, if the GPU is well designed with no other bottlenecks, you will need 100% more power to render 100% more pixels.
This makes the 4090 around 4x stronger than the 4070 rather than 2x. Performance deltas are calculated using frame rate/frame time. Not everything scales with resolution, so I’m unsure why you keep using it to calculate the entire performance profile of a given GPU.
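A small sketch of the point about deltas: the same performance gap comes out whether you compute it from frame rate or from frame time (assumed example numbers, not figures from this thread):

```python
# The same delta expressed via fps or via frame time (ms per frame).
def delta_from_fps(fps_a: float, fps_b: float) -> float:
    return fps_a / fps_b - 1

def delta_from_frametime(fps_a: float, fps_b: float) -> float:
    ft_a, ft_b = 1000 / fps_a, 1000 / fps_b
    return ft_b / ft_a - 1

print(delta_from_fps(60, 40))        # 0.5 -> 50% faster
print(delta_from_frametime(60, 40))  # 0.5 -> same 50%, from 25.0 ms vs 16.7 ms
```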
 

XXL

Member
I don't have to. I've already seen the reactions to Xbox games performing worse than their PS counterparts and it is attributed to poor optimization every single time. When games are worse on PS, same thing. I've seen PC gamers complain about shitty PC ports from devs plenty. I know because I've been one of them. So no, there is nothing to "keep in mind". We've been down this road plenty of times.
"Lazy devs" is pretty much the built in excuse for every platform with a bad port.
 

Zathalus

Member
There is not enough power here to go from 860p to 1200p. Even by Cerny's own claims it's 45%, but we've seen it top out at 30-35%. 860p to 1080p would require almost 100%. 860p to 1200p would require 150% more GPU.

This isn't the PS4 Pro. You don't have that much extra power available, so devs have their hands tied. They were told PSSR is this great DLSS-like AI upscaler, but it's clearly not. At least not right now. At least not at those low resolutions.

To go from 1080p to 1440p, you will need 75% more GPU. You only have around half of that here. So 1080p would become 1296p at best. But we know from DF's own pixel counting that it wasn't 1080p and dropped below that.
Resolution scaling doesn’t require a 1:1 increase in GPU power. Going from 1080p to native 4K in Alan Wake 2 doesn’t require a 4x jump in GPU power, just over half of that in fact.
 

Gaiff

SBI’s Resident Gaslighter
Resolution scaling doesn’t require a 1:1 increase in GPU power. Going from 1080p to native 4K in Alan Wake 2 doesn’t require a 4x jump in GPU power, just over half of that in fact.
This. The 4090 at 4K performs like a 4070 at 1080p, but at the same resolution without a CPU bottleneck, the 4090 will deliver around twice the frame rate, not four times.
 

Lysandros

Member
It's just a particular user's wishful speculation. Sony can't just shamelessly steal Intel's own solution and get away with it; that's ridiculous. I really don't know what is going on with people assuming absolutely the worst from Sony when it comes to work ethics, originality and competence. It's becoming comical at this point, as if the company were the apex of pathological lies or something.
 

Bojji

Member
Resolutions are measured on two axes. 1080p is 1920*1080 pixels. That's 2.1 million pixels. 1440p is 2560*1440, or 3.68 million pixels. Basic math tells us it would require 75% more GPU to render 75% more pixels.

800p is 1200*800 = 1 million pixels. 1080p is 2.1 million pixels. 100% more pixels = 100% more GPU power. At the minimum. You might also need more VRAM bandwidth to ensure there are no bottlenecks. In the PS5 Pro's case, its 67% more powerful GPU is topping out at 45% more performance because they only upgraded the VRAM by 25% and didn't include the cache that PC cards use to perform up to spec. But typically, if the GPU is well designed with no other bottlenecks, you will need 100% more power to render 100% more pixels.

This console is too weak to fix the most demanding games. And that's with a 45% increase in power; we now know it's probably lower than that.
 
This console is too weak to fix the most demanding games. And that's with a 45% increase in power; we now know it's probably lower than that.

no, the developers are too weak

AW2 and Star Wars are not the best-looking games on the system and are not open world; the developers are just not good at, or not focused on, optimizing for consoles

There's no excuse for them to be running at such low res on base consoles to begin with
 

Kangx

Member from Brazile
Resolutions are measured on two axes. 1080p is 1920*1080 pixels. That's 2.1 million pixels. 1440p is 2560*1440, or 3.68 million pixels. Basic math tells us it would require 75% more GPU to render 75% more pixels.

800p is 1200*800 = 1 million pixels. 1080p is 2.1 million pixels. 100% more pixels = 100% more GPU power. At the minimum. You might also need more VRAM bandwidth to ensure there are no bottlenecks. In the PS5 Pro's case, its 67% more powerful GPU is topping out at 45% more performance because they only upgraded the VRAM by 25% and didn't include the cache that PC cards use to perform up to spec. But typically, if the GPU is well designed with no other bottlenecks, you will need 100% more power to render 100% more pixels.
OK. So going from 800p to 1080p requires 100% more power; then the Pro should be in the 900p range with a 45 percent uplift in power?

Damn, Sony created the Pro for a measly ~100p increase from 800p according to your calculations.

You did not answer me: how can Baldur's Gate 3 go from 1080p to 1440p on the Pro if you insist on the 30 percent figure? Also, Empire of the Ants runs at 60fps at 1440p on the Pro where the PS5 sticks to 30fps with a slightly higher resolution.
 

Gaiff

SBI’s Resident Gaslighter
It's just a particular user's wishful speculation. Sony can't just shamelessly steal Intel's own solution and get away with it; that's ridiculous. I really don't know what is going on with people assuming absolutely the worst from Sony when it comes to work ethics, originality and competence. It's becoming comical at this point, as if the company were the apex of pathological lies or something.
I’m not too familiar with software patent laws, but the XeSS legal documents on github clearly state no reverse engineering, decompilation, or disassembly of XeSS is permitted.

I don’t think Sony would ever just steal Intel’s homework and get away with it. Or maybe they would if it weren’t for them meddling kids.
 

SlimySnake

Flashless at the Golden Globes
You did not answer me: how can Baldur's Gate 3 go from 1080p to 1440p on the Pro if you insist on the 30 percent figure? Also, Empire of the Ants runs at 60fps at 1440p on the Pro where the PS5 sticks to 30fps with a slightly higher resolution.
It's probably not doing that. These games all use dynamic resolution nowadays, making it harder to count pixels.

You are never going to magically get 100% more performance from a weaker GPU. If that were the case, no one would ever need to buy a GPU.
 

Zathalus

Member
Impossible.

4080 is 55fps at 4K, a 6700 XT is doing 59fps at 1080p. The 4080 is just over twice as fast as the 6700 XT, not four times as fast.

Just looking at the 4080 you’d see it does 122fps at 1080p. Assuming performance and resolution scale 1:1 you’d expect the 4080 to be 30fps at 4K, but it’s sitting nicely at 55fps.
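A sketch using only the figures quoted in this post (4080: 122fps at 1080p, 55fps at 4K; 6700 XT: 59fps at 1080p), to show how far the actual drop is from 1:1 pixel scaling:

```python
# Figures quoted above; treat them as illustrative benchmark numbers.
fps_4080_1080p, fps_4080_4k = 122, 55
fps_6700xt_1080p = 59

pixel_factor = (3840 * 2160) / (1920 * 1080)        # 4.0x the pixels
expected_if_linear = fps_4080_1080p / pixel_factor  # ~30.5 fps if cost scaled 1:1
actual_cost_factor = fps_4080_1080p / fps_4080_4k   # ~2.2x, not 4x

print(f"expected 4K fps under 1:1 scaling: {expected_if_linear:.1f}")
print(f"actual 1080p -> 4K cost factor: {actual_cost_factor:.2f}x")
print(f"4080 at 4K vs 6700 XT at 1080p: {fps_4080_4k / fps_6700xt_1080p:.2f}x the fps at 4x the pixels")
```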
 

SlimySnake

Flashless at the Golden Globes

4080 is 55fps at 4K, a 6700 XT is doing 59fps at 1080p. The 4080 is just over twice as fast as the 6700 XT, not four times as fast.

Just looking at the 4080 you’d see it does 122fps at 1080p. Assuming performance and resolution scale 1:1 you’d expect the 4080 to be 30fps at 4K, but it’s sitting nicely at 55fps.
It could be CPU bound at 1080p.

You need a more powerful GPU to render more pixels. Sorry, but that's just basic common sense. There is literally nothing more to discuss.
 

Topher

Identifies as young
This console is too weak to fix the most demanding games. And that's with a 45% increase in power; we now know it's probably lower than that.

Whatever the cap is, Pro ain't going to brute force through issues like high-end PC GPUs. Devs need to be smart in how they utilize this increase in power over the PS5. But yeah... for a game like Jedi Survivor that has had issues from the start, Pro isn't going to act like a 4090 and brute force through and get the desired result, just at a lower frame rate.
 

Kangx

Member from Brazile
It's probably not doing that. These games all use dynamic resolution nowadays, making it harder to count pixels.

You are never going to magically get 100% more performance from a weaker GPU. If that were the case, no one would ever need to buy a GPU.
OK.

Oliver counted pixels consistently at 960p for the PS5, upscaled to 1440p; the Pro goes from 1440p to 4K. At 30fps the Pro is at native 4K vs 1440p on the PS5. The Pro also has 60fps in split screen.

[embedded videos]

What do you notice from these videos?
A difference in resolution or frame rate?
 

Zathalus

Member
It could be CPU bound at 1080p.

You need a more powerful GPU to render more pixels. Sorry, but that's just basic common sense. There is literally nothing more to discuss.
Resolution is just one part of the rendering pipeline; increasing or decreasing the resolution doesn't change the entire rendering budget. Some things like textures or geometry have a fixed rendering cost or are not influenced heavily by resolution changes. Some post-processing effects are like this as well.

It's also quite clearly not CPU limited at 1080p, considering the 4090 does over 150fps at 1080p. But if you still think that is the case, just look at a slower GPU. The 3090 is 43fps at 4K, and at 1080p it comes in at 99fps, not the 172fps you'd expect. The 6600 XT is doing 44fps at 1080p, with the 3090 again being roughly twice as powerful, and yet it still gets similar fps at 4x the resolution.

Now I'm not saying an increase of 4x in resolution requires only a 2x increase in GPU power. The difference can be influenced by the game engine, memory bandwidth, what post-processing it uses, etc. You'd hardly ever find a perfect 1:1 resolution-to-GPU increase though.
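One way to picture the fixed-cost point is a rough two-term frame-time model, fitted to the 3090 figures quoted above (99fps at 1080p, 43fps at 4K). This is a sketch, not a real profile:

```python
# Rough model: frame_time = fixed + per_pixel * pixels, fitted to two data points.
px_1080p = 1920 * 1080
px_4k = 3840 * 2160

t_1080p = 1000 / 99   # ~10.1 ms per frame
t_4k = 1000 / 43      # ~23.3 ms per frame

per_pixel = (t_4k - t_1080p) / (px_4k - px_1080p)   # ms per extra pixel
fixed = t_1080p - per_pixel * px_1080p              # resolution-independent part

print(f"fixed cost:     {fixed:.1f} ms/frame")                 # ~5.7 ms
print(f"per-pixel cost: {per_pixel * 1e6:.2f} ms per Mpixel")  # ~2.1 ms

# With ~5.7 ms of each frame unaffected by resolution, 4x the pixels is far
# from 4x the total frame time.
```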
 

yogaflame

Member
The PS5 Pro isn't even a month into its life cycle. Updates and patches will surely come, and the nature of PSSR's ML is that it continues to improve, learn, and evolve as time passes; by next year more games will be developed with the PS5 Pro in mind. Patience.
 

SlimySnake

Flashless at the Golden Globes
Resolution is just one part of the rendering pipeline; increasing or decreasing the resolution doesn't change the entire rendering budget. Some things like textures or geometry have a fixed rendering cost or are not influenced heavily by resolution changes. Some post-processing effects are like this as well.

It's also quite clearly not CPU limited at 1080p, considering the 4090 does over 150fps at 1080p. But if you still think that is the case, just look at a slower GPU. The 3090 is 43fps at 4K, and at 1080p it comes in at 99fps, not the 172fps you'd expect. The 6600 XT is doing 44fps at 1080p, with the 3090 again being roughly twice as powerful, and yet it still gets similar fps at 4x the resolution.

Now I'm not saying an increase of 4x in resolution requires only a 2x increase in GPU power. The difference can be influenced by the game engine, memory bandwidth, what post-processing it uses, etc. You'd hardly ever find a perfect 1:1 resolution-to-GPU increase though.
You have a 3080, right? Just test this.

It might not be 1:1, but it's very close. I've been playing games on PC all my life, and the best way to gain more performance is by reducing resolution. There are exceptions like Star Wars Jedi Survivor, which is CPU bound, TLOU1, which was VRAM bound, or Black Myth path tracing, which is utter trash on 30-series cards, but if you are a PC gamer, you should know that you either downgrade settings or resolution to gain more frames.
 

Clear

CliffyB's Cock Holster
Resolution is just one part of the rendering pipeline; increasing or decreasing the resolution doesn't change the entire rendering budget. Some things like textures or geometry have a fixed rendering cost or are not influenced heavily by resolution changes. Some post-processing effects are like this as well.

It's also quite clearly not CPU limited at 1080p, considering the 4090 does over 150fps at 1080p. But if you still think that is the case, just look at a slower GPU. The 3090 is 43fps at 4K, and at 1080p it comes in at 99fps, not the 172fps you'd expect. The 6600 XT is doing 44fps at 1080p, with the 3090 again being roughly twice as powerful, and yet it still gets similar fps at 4x the resolution.

Now I'm not saying an increase of 4x in resolution requires only a 2x increase in GPU power. The difference can be influenced by the game engine, memory bandwidth, what post-processing it uses, etc. You'd hardly ever find a perfect 1:1 resolution-to-GPU increase though.

Exactly this... As I've said before to the purveyors of the argument that X thing is CPU-bound: OK, so now explain to me EXACTLY where those CPU cycles are going.

If you can explain that, then you can pin down specifically where the stress points are in the pipeline and, more to the point, what part of the hardware is holding things up performance-wise.

Just because you observe this behaviour on PC, why would it follow that systems that do not correspond exactly in architecture exhibit the same issues?

It's the exact same woolly thinking that leads to people predicting performance based on the TF count of the GPU, and we have all now seen how inaccurate that turned out to be!

Yes, you can judge things like that in isolation ***WHEN ALL ELSE IS EQUAL***, like when swapping components in and out on a PC. However, when that isn't the case... the math stops working.

But hey, while amateurs like DF keep parroting the same simplistic formulations, either out of ignorance or the misguided thought that the reality is simply too complex and nuanced to spoon-feed to their audience, these fallacies will remain unchallenged.
 

SlimySnake

Flashless at the Golden Globes
OK.

Oliver counted pixels consistently at 960p for the PS5, upscaled to 1440p; the Pro goes from 1440p to 4K. At 30fps the Pro is at native 4K vs 1440p on the PS5. The Pro also has 60fps in split screen.

[embedded videos]

What do you notice from these videos?
A difference in resolution or frame rate?

The PS5 version is locked at 30 fps. We don't know how much extra headroom it has available. A lot of PS5 games got massive framerate upgrades when the VRR patch came out and they unlocked the framerate; the Uncharted collection, TLOU, Ratchet, and Spider-Man were all running in the mid-40s. It's possible Empire of the Ants also ran above 30 fps, hence the large DRS range.

I have literally done tests on my own TV and the Pro is not doubling the PS5's performance in any game: Spider-Man 2, Ratchet, TLOU1 and TLOU2. If you have a Pro and an LG TV, repeatedly press the green button and it will show you the framerate. You don't need DF anymore.
 

Zathalus

Member
You have a 3080, right? Just test this.

It might not be 1:1, but it's very close. I've been playing games on PC all my life, and the best way to gain more performance is by reducing resolution. There are exceptions like Star Wars Jedi Survivor, which is CPU bound, TLOU1, which was VRAM bound, or Black Myth path tracing, which is utter trash on 30-series cards, but if you are a PC gamer, you should know that you either downgrade settings or resolution to gain more frames.
I'm not claiming reducing resolution doesn't give you back performance. It's just very rarely a 1:1 increase. Dropping from 4K to 1080p can often more than double your frames, but you'd very, very rarely get a 4x increase. That being said, if you are VRAM limited at 4K you can get an even larger than 4x increase in your frame rate, but that's due to the GPU being forced to swap out VRAM.
 

Gaiff

SBI’s Resident Gaslighter
You have a 3080, right? Just test this.

It might not be 1:1, but it's very close. I've been playing games on PC all my life, and the best way to gain more performance is by reducing resolution. There are exceptions like Star Wars Jedi Survivor, which is CPU bound, TLOU1, which was VRAM bound, or Black Myth path tracing, which is utter trash on 30-series cards, but if you are a PC gamer, you should know that you either downgrade settings or resolution to gain more frames.
No, it isn't very close at all. Going from 1080p to 4K won't cut your performance to 1/4th unless you run into VRAM problems. It's much more common that a 100% increase in resolution will reduce performance by 50%.

Find us games where going from 4K to 1080p and vice-versa results in an exactly proportional performance decrease/increase based on the resolution.
 

Kangx

Member from Brazile
The PS5 version is locked at 30 fps. We don't know how much extra headroom it has available. A lot of PS5 games got massive framerate upgrades when the VRR patch came out and they unlocked the framerate; the Uncharted collection, TLOU, Ratchet, and Spider-Man were all running in the mid-40s. It's possible Empire of the Ants also ran above 30 fps, hence the large DRS range.

I have literally done tests on my own TV and the Pro is not doubling the PS5's performance in any game: Spider-Man 2, Ratchet, TLOU1 and TLOU2. If you have a Pro and an LG TV, repeatedly press the green button and it will show you the framerate. You don't need DF anymore.
You are going all over the place and you have not addressed Baldur's Gate 3.

I think we can stop here lol. I will let two other people debate with you.
 

SlimySnake

Flashless at the Golden Globes
I'm not claiming reducing resolution doesn't give you back performance. It's just very rarely a 1:1 increase. Dropping from 4K to 1080p can often more than double your frames, but you'd very, very rarely get a 4x increase. That being said, if you are VRAM limited at 4K you can get an even larger than 4x increase in your frame rate, but that's due to the GPU being forced to swap out VRAM.
We are not doing that here though. We are talking about PS5 Pro games that run at 860p or 1080p simply increasing the resolution in their current 60 fps modes. No one is talking about going from 1080p to 4K; this isn't an X1-to-X1X-caliber leap. We just want to go from 800p to 1080p at the same framerate between two consoles with a roughly 45% performance delta. And we just can't.

If we could have done it, then we would have seen devs go ahead and do that. Instead they just used the extra GPU power on upgrading some settings and called it a day. How many games have we seen DF cover that have literally remained at the same resolution as the PS5 modes? In AW2, you get RT. Same resolution. Same framerate. In Star Wars, you get RT, same resolution, same framerate. In Dragon Age, RT, same resolution. RT reflections don't cost 100% more GPU, 35% at best, and we've seen that 35% figure in many other games that simply used the GPU to render more frames.
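Treating those numbers as a rough budget (illustrative only: the ~45% uplift and the ~35% RT cost are the figures used in this post, and the linear pixels-to-GPU assumption is the one being debated in the thread), there is very little left over for resolution:

```python
# Rough budget: a ~45% GPU uplift mostly spent on RT leaves little for pixels.
uplift = 1.45    # Pro vs base PS5 throughput, as used in this thread
rt_cost = 1.35   # rough extra cost of the added RT effects, per the post above

leftover = uplift / rt_cost    # ~1.07x throughput left for resolution
axis_scale = leftover ** 0.5   # per-axis scale if cost tracks pixel count

print(f"leftover for resolution: {leftover - 1:+.0%} pixels")  # about +7%
print(f"800p would become roughly {800 * axis_scale:.0f}p")    # about 829p
```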
 

Mister Wolf

Member
We are not doing that here though. We are talking about PS5 Pro games that run at 860p or 1080p simply increasing the resolution in their current 60 fps modes. No one is talking about going from 1080p to 4K; this isn't an X1-to-X1X-caliber leap. We just want to go from 800p to 1080p at the same framerate between two consoles with a roughly 45% performance delta. And we just can't.

If we could have done it, then we would have seen devs go ahead and do that. Instead they just used the extra GPU power on upgrading some settings and called it a day. How many games have we seen DF cover that have literally remained at the same resolution as the PS5 modes? In AW2, you get RT. Same resolution. Same framerate. In Star Wars, you get RT, same resolution, same framerate. In Dragon Age, RT, same resolution. RT reflections don't cost 100% more GPU, 35% at best, and we've seen that 35% figure in many other games that simply used the GPU to render more frames.

This Pro system is a disappointment. They should have made it more powerful.
 

SlimySnake

Flashless at the Golden Globes
You are going all over the place and you have not addressed Baldur's Gate 3.

I think we can stop here lol. I will let two other people debate with you.
Baldur's Gate 3 is literally locked at 1440p 30 fps. It's the same as Empire of the Ants. You cannot determine the actual performance of the GPU by looking at capped framerates.

Again, if you have a Pro, or a PC, you can test this yourself. No need to listen to me, debate me or listen to DF. Go run your own tests and come to your own conclusions.
 

bender

What time is it?
I don't have to. I've already seen the reactions to Xbox games performing worse than their PS counterparts and it is attributed to poor optimization every single time. When games are worse on PS, same thing. I've seen PC gamers complain about shitty PC ports from devs plenty. I know because I've been one of them. So no, there is nothing to "keep in mind". We've been down this road plenty of times.

Keep in mind, we've never been down this road before.
 

Kangx

Member from Brazile
Baldur's Gate 3 is literally locked at 1440p 30 fps. It's the same as Empire of the Ants. You cannot determine the actual performance of the GPU by looking at capped framerates.

Again, if you have a Pro, or a PC, you can test this yourself. No need to listen to me, debate me or listen to DF. Go run your own tests and come to your own conclusions.
Wait a damn minute. I don't get your logic. So the Pro being able to go from 1440p to 4K doesn't count because the game is locked at 30 on the PS5? We also don't know how high the frame rate could go on the Pro with a locked 30 at native 4K.

Why did the devs lock the game at 30fps at 1440p? Why not increase the res to 1600p? And why is the Pro going for native 4K and not 1800p?

Exactly: the performance would not allow it on the PS5, and the Pro has more than enough performance to run at 4K instead of a lower resolution. Man, your logic is warped.
 

Zathalus

Member
We are not doing that here though. We are talking about PS5 Pro games that run at 860p or 1080p simply increasing the resolution in their current 60 fps modes. No one is talking about going from 1080p to 4K; this isn't an X1-to-X1X-caliber leap. We just want to go from 800p to 1080p at the same framerate between two consoles with a roughly 45% performance delta. And we just can't.

If we could have done it, then we would have seen devs go ahead and do that. Instead they just used the extra GPU power on upgrading some settings and called it a day. How many games have we seen DF cover that have literally remained at the same resolution as the PS5 modes? In AW2, you get RT. Same resolution. Same framerate. In Star Wars, you get RT, same resolution, same framerate. In Dragon Age, RT, same resolution. RT reflections don't cost 100% more GPU, 35% at best, and we've seen that 35% figure in many other games that simply used the GPU to render more frames.
35% should be fine to increase the resolution significantly (assuming all else stays the same). Just going back to that Alan Wake 2 comparison, you can go from 1080p60 to 1440p60 just by going from a 4060 Ti to a 4070, despite the gap in performance between the two being nowhere near that resolution gap. Changing resolution from 1080p to 1440p (or 900p to 1200p) very rarely requires the 75% performance difference you would need from just the resolution numbers.

Saying 'oh, you need an X% more powerful GPU to increase the resolution by that same X%' is almost never true, as graphical load and resolution are simply not tied 1:1.
 

SlimySnake

Flashless at the Golden Globes
Wait a damn minute. I don't get your logic. So the Pro being able to go from 1440p to 4K doesn't count because the game is locked at 30 on the PS5? We also don't know how high the frame rate could go on the Pro with a locked 30 at native 4K.

Why did the devs lock the game at 30fps at 1440p? Why not increase the res to 1600p? And why is the Pro going for native 4K and not 1800p?

Exactly: the performance would not allow it. Man, your logic is warped.
No, I am saying it's not a good test because we literally can't measure the delta. We have other games that let us measure the delta. Literally a dozen of them that show you cannot go from 800p to 1080p. You are in a thread about a game that has shot-by-shot comparisons in the video and none of them show a 100% gain in pixels, let alone the 1200p claims made by the developer themselves.

[image]

This is what DF found.

[image: DF's pixel counts]

Literally one shot actually goes to 1224p. The rest are identical. You keep asking why the devs wouldn't just increase performance, and here is your answer: they literally can't. The DRS has a range of 648p-1224p and it stays at that same exact range because they spent the entire GPU on rendering RTAO and RT reflections. If the GPU were 100% more powerful like you want it to be, we would see at least 50% more pixels here on TOP of the RTAO and RT reflections.
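For scale, a quick sketch of the pixel swing across that 648p-1224p DRS window, and what "at least 50% more pixels" would do to it (16:9 frames assumed, naive pixel math only):

```python
# Pixel math for the quoted 648p-1224p DRS window (16:9 frames assumed).
def px(height: int) -> int:
    return round(height * 16 / 9) * height

low, high = 648, 1224
print(f"DRS swing: {px(high) / px(low):.2f}x the pixels top to bottom")  # ~3.57x

boost = 1.5 ** 0.5  # +50% pixels means each axis grows by sqrt(1.5)
print(f"+50% pixels would shift the window to ~{low * boost:.0f}p-{high * boost:.0f}p")
```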
 

Gaiff

SBI’s Resident Gaslighter
Random sample of games.

Alan Wake 2: [1080p and 4K benchmark charts]

God of War Ragnarok: [1080p and 4K benchmark charts]

Frontiers of Pandora: [1080p and 4K benchmark charts]

Lords of the Fallen: [1080p and 4K benchmark charts]

Black Myth Wukong: [1080p and 4K benchmark charts]

The pixel count almost never has a 1:1 impact on performance. In fact, SlimySnake would be much closer to being right if he made that argument for the axes rather than the total: increasing each axis by 2x might result in a 2x performance reduction, but increasing the pixel count by 2x will almost never result in a 2x performance reduction. Watch him be a brick wall about it like he was with the CPU though, insisting that the PS5 was so often CPU-limited below 60fps when there are only a few exceptions such as BG3. You'll present him with a mountain of data and evidence, he'll deflect and won't present his own data, leave the thread, and then spew the same falsehoods elsewhere.

The most extreme cases are those such as Frontiers of Pandora, where we see a 3x performance reduction for a 4x pixel count increase. Black Myth Wukong going from 1080p High to 4K High results in exactly a 2x performance reduction for the 4090, going from 138 to 69fps. A mid-range card like the 3070 goes from 63 to 26fps when it should be closer to 15 according to Slimy.
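One compact way to express the pattern in those charts is an empirical scaling exponent n in fps ~ pixels^(-n): n = 1 would mean cost tracks total pixels, n = 0.5 would mean it tracks each axis. A sketch using the Wukong figures quoted above:

```python
import math

# Empirical exponent n in fps ~ pixels^(-n) for a 4x pixel increase (1080p -> 4K).
def scaling_exponent(fps_low_res: float, fps_high_res: float, pixel_factor: float = 4.0) -> float:
    return math.log(fps_low_res / fps_high_res) / math.log(pixel_factor)

print(f"4090, 138 -> 69 fps: n = {scaling_exponent(138, 69):.2f}")  # 0.50
print(f"3070,  63 -> 26 fps: n = {scaling_exponent(63, 26):.2f}")   # ~0.64
```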
 

SlimySnake

Flashless at the Golden Globes
35% should be fine to increase the resolution significantly (assuming all else stays the same). Just going back to that Alan Wake 2 comparison, you can go from 1080p60 to 1440p60 just by going from a 4060 Ti to a 4070, despite the gap in performance between the two being nowhere near that resolution gap. Changing resolution from 1080p to 1440p (or 900p to 1200p) very rarely requires the 75% performance difference you would need from just the resolution numbers.

Saying 'oh, you need an X% more powerful GPU to increase the resolution by that same X%' is almost never true, as graphical load and resolution are simply not tied 1:1.
??

The 4070 is 29 TFLOPs. The 4060 Ti is 22 TFLOPs.

AW2 at 1440p runs at 62.8 fps on the 4070, 47.5 fps on the 4060 Ti's 16GB variant, and 47.2 on the 8GB variant, so we can assume it's not VRAM bound at this resolution.

62.8/47.5 = 1.32 (a 32% gap)
29.15/22.06 = 1.32 (a 32% gap)
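Checking both ratios explicitly (TFLOP figures and 1440p fps as quoted above):

```python
# The two ratios from the post above, computed explicitly.
tflops_4070, tflops_4060ti = 29.15, 22.06
fps_4070_1440p, fps_4060ti_1440p = 62.8, 47.5

print(f"fps ratio at 1440p: {fps_4070_1440p / fps_4060ti_1440p - 1:.0%}")  # ~32%
print(f"TFLOP ratio:        {tflops_4070 / tflops_4060ti - 1:.0%}")        # ~32%
# Same-resolution fps tracking the compute gap is the comparison being made here.
```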
 

sachos

Member
Dude, WTF are devs doing? Don't they playtest their games to look for differences? So disappointing.
Also, in the conclusion Alex says these PSSR issues may be due to low input resolution making PSSR look worse, but isn't the input res in Performance mode similar to games like FF7RR, which show way better results?
Plus the Pro Quality mode in Jedi Survivor is inputting a way higher resolution than the base Quality mode and still has issues, and those resolutions are higher than, say, TLOU's input resolutions while looking much worse.
 

Zathalus

Member
??

The 4070 is 29 TFLOPs. The 4060 Ti is 22 TFLOPs.

AW2 at 1440p runs at 62.8 fps on the 4070, 47.5 fps on the 4060 Ti's 16GB variant, and 47.2 on the 8GB variant, so we can assume it's not VRAM bound at this resolution.

62.8/47.5 = 1.32 (a 32% gap)
29.15/22.06 = 1.32 (a 32% gap)
So a 32% difference in raw GPU power, yet the 4070 delivers roughly the same performance at 1440p that the 4060 Ti does at 1080p, despite the 75% resolution gap.

So increasing GPU performance by a mere 32% allows the resolution to be increased by 75%.
 

SlimySnake

Flashless at the Golden Globes
Also, in the conclusion Alex says these PSSR issues may be due to low input resolution making PSSR look worse, but isn't the input res in Performance mode similar to games like FF7RR, which show way better results?
It's not. The game rarely hits the 1200p resolution FF7 Rebirth is upscaling from.

[image: resolution counts]


So a 32% difference in raw GPU power, yet the 4070 delivers roughly the same performance at 1440p that the 4060 Ti does at 1080p, despite the 75% resolution gap.

So increasing GPU performance by a mere 32% allows the resolution to be increased by 75%.
That's an odd way to look at it, even though I've just proven that performance literally scales 1:1 with GPU power.

You can look at the 4K benchmarks in AW2 to get a better idea because, again, the CPU might be skewing the lower-resolution benchmarks for the GPU. You have the 4K mode running at 35.3 and the 1440p mode running at 64.8. That's an 83% performance delta. Not 1:1, but not that far off either.
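The same comparison with the pixel counts alongside (the 35.3 fps and 64.8 fps figures are the ones quoted in this post; the specific GPU isn't stated):

```python
# AW2: 64.8 fps at 1440p vs 35.3 fps at 4K (figures quoted above).
fps_1440p, fps_4k = 64.8, 35.3
px_1440p, px_4k = 2560 * 1440, 3840 * 2160

print(f"performance delta: {fps_1440p / fps_4k - 1:.0%}")  # ~84%
print(f"pixel delta:       {px_4k / px_1440p - 1:.0%}")    # 125%
```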
 

Zathalus

Member
It's not. The game rarely hits the 1200p resolution FF7 Rebirth is upscaling from.

[image: resolution counts]



That's an odd way to look at it, even though I've just proven that performance literally scales 1:1 with GPU power.

You can look at the 4K benchmarks in AW2 to get a better idea because, again, the CPU might be skewing the lower-resolution benchmarks for the GPU. You have the 4K mode running at 35.3 and the 1440p mode running at 64.8. That's an 83% performance delta. Not 1:1, but not that far off either.
Performance scales with GPU power usually (but not always; it depends on the game). Performance is not the same thing as resolution though. A 30/50/100 percent jump in resolution doesn't require a 30/50/100 percent jump in GPU performance.
 

Gaiff

SBI’s Resident Gaslighter
Performance scales with GPU power usually (but not always; it depends on the game). Performance is not the same thing as resolution though. A 30/50/100 percent jump in resolution doesn't require a 30/50/100 percent jump in GPU performance.
Don't bother. We showed this guy this video of GOW on PS4 Pro and he insisted that it ran at 60fps despite almost never reaching that mark.

 

Kangx

Member from Brazile
No, I am saying it's not a good test because we literally can't measure the delta. We have other games that let us measure the delta. Literally a dozen of them that show you cannot go from 800p to 1080p. You are in a thread about a game that has shot-by-shot comparisons in the video and none of them show a 100% gain in pixels, let alone the 1200p claims made by the developer themselves.

[image]

This is what DF found.

[image: DF's pixel counts]

Literally one shot actually goes to 1224p. The rest are identical. You keep asking why the devs wouldn't just increase performance, and here is your answer: they literally can't. The DRS has a range of 648p-1224p and it stays at that same exact range because they spent the entire GPU on rendering RTAO and RT reflections. If the GPU were 100% more powerful like you want it to be, we would see at least 50% more pixels here on TOP of the RTAO and RT reflections.

Like many have said here, an increase in resolution is never 1:1 with power.

Other games and the game you listed have different variables, so you can't really compare them 1:1, unlike Baldur's Gate where there are only resolution differences.

Since you listed Jedi Survivor here, I've got bad news for you. The base consoles (PS5/Series X) averaged around 720p at launch with RT, then had a significant increase in resolution to an average of around 980p and ran much better too.

Also, the performance mode on the Pro is running at a flat 60fps. The performance is in a much better state than the base consoles were in when the game launched back then. This indicates the Pro should have an even bigger potential gain in resolution vs the base consoles if it dropped the RT features.

So the 1200p I listed for the Pro is not just possible but likely; it is even conservative in this situation.

[image: resolution comparison]
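Pixel factors for the averages quoted above, as a rough sketch (16:9 frames assumed, illustrative only):

```python
# Pixel factors between the quoted averages (16:9 frames assumed).
def px(height: int) -> int:
    return round(height * 16 / 9) * height

print(f"720p -> 980p : {px(980) / px(720):.2f}x the pixels")   # ~1.85x
print(f"980p -> 1200p: {px(1200) / px(980):.2f}x the pixels")  # ~1.50x
print(f"720p -> 1200p: {px(1200) / px(720):.2f}x the pixels")  # ~2.78x
```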
 

SlimySnake

Flashless at the Golden Globes
Like many have said here, an increase in resolution is never 1:1 with power.

Other games and the game you listed have different variables, so you can't really compare them 1:1, unlike Baldur's Gate where there are only resolution differences.

Since you listed Jedi Survivor here, I've got bad news for you. The base consoles (PS5/Series X) averaged around 720p at launch with RT, then had a significant increase in resolution to an average of around 980p and ran much better too.

Also, the performance mode on the Pro is running at a flat 60fps. The performance is in a much better state than the base consoles were in when the game launched back then. This indicates the Pro should have an even bigger potential gain in resolution vs the base consoles if it dropped the RT features.

So the 1200p I listed for the Pro is not just possible but likely; it is even conservative in this situation.

[image: resolution comparison]

Now you are posting Series X benchmarks when DF literally posted PS5 comparisons.
 

sachos

Member
It's not. The game rarely hits the 1200p resolution FF7 Rebirth is upscaling from.

[image: resolution counts]
Oh, I remembered wrong. Here is what DF counted for FF7RR: "taking the game from an average internal resolution of roughly 1152p to 1224p in my counts to 4K. I got counts as low as 1080p, and as high as 1296p".
I still think the higher-than-1440p input res of the Pro Quality mode for Jedi Survivor, while still having problems, shows it's not just an input resolution problem; there is something else going on here with PSSR.
 
It'll be like this with every game.

PlayStation's days are numbered.
That's your fault for buying an overpriced mid-gen product that does nothing but let Sony be greedier with the PS6. If people had rejected this price point and just gotten a PC instead if they wanted whiz-bang graphics, it would have sent a message.

Now here we are. I've been saying it for years now. You can't have top-notch ray tracing and settings and expect to get 4K resolution. That type of performance requires an xx80 or xx90 series Nvidia card, and those are like 1.5k+ for the graphics card alone.
People need to be fine with 1080p/1440p and realize 4K with top visuals is next gen. We weren't even maxing out 1080p last gen. The TVs got pushed and they jumped the gun.
 

SlimySnake

Flashless at the Golden Globes
Oh, I remembered wrong. Here is what DF counted for FF7RR: "taking the game from an average internal resolution of roughly 1152p to 1224p in my counts to 4K. I got counts as low as 1080p, and as high as 1296p".
I still think the higher-than-1440p input res of the Pro Quality mode for Jedi Survivor, while still having problems, shows it's not just an input resolution problem; there is something else going on here with PSSR.
It's a problem with RT effects as well. We've seen this pixel-crawling/shimmering issue in AW2, SH2, Jedi Survivor and Star Wars Outlaws.
 