
[Digital Foundry] Immortals of Aveum PS5/Xbox Series X/S: Unreal Engine 5 is Pushed Hard - And Image Quality Suffers

Redneckerz

Those long posts don't cover that red neck boy
Is it actually PS2 quality or are we going full hyperbole? Because from what I remember of PS2 quality, trees did not look that good. Heck I remember FFX being a blurry mess on some landscapes.
It's pretty much hyperbole, yes. Well, except that Transformers: Prelude To Energon exists (which is a third-person game, mind you, but you can play it as an FPS):

[Screenshot: Transformers (PS2), a Light Unit Decepticlone standing guard]

What amazes me is folks still think there's no need for mid-gen refresh consoles. Not just "I'm not interested" but adamant that there shouldn't be and others shouldn't even get the option...at the end of next year?!
We have had Horizon Forbidden West on PS4 and Quantum Break running on Xbox One, machines with processors that were used in nettops and GPUs stemming from the Radeon HD 7850 range. That's just the OG machines, which still get a lot of (third-party) love these days.

If anything, IOA tells me a few things:
  • Nanite is a system hog and needs optimization
  • Despite how beautiful it can look, low res kills everything
  • Even with those looks, one could argue that Lumen could have been swapped for pre-baked lighting. Just look at Dead Island 2 and see how good that looks; that's its predecessor's engine, with baked global illumination
So no, I don't think even the base consoles are being tapped fully yet, and I am frankly intrigued that you get so many emojis for it. This is logic, plain and simple.
This. And let's not forget DF's faces of disgust at the mere mention of a PS5 Pro, and their saying that it's completely unnecessary.
Because it is unnecessary. Just because an indie dev can get results quickly with UE5 doesn't mean they can also get good results. The game can look great, and it can definitely serve as a graphical benchmark, but it does highlight that the new tech needs optimization, and that optimization shouldn't have to come from Ascendant.
PRO CONSOLES CAUSED A BIG IMPROVEMENT TO EVERY SINGLE GAME LAST GEN SO WHAT ARE YOU TALKING ABOUT?
Not sure if you are joking, but the current gen isn't unlike the previous gen. Last gen had relatively good GPUs but sub-par CPUs compared to the gen before that (PS3/360). The current gen, when launched, had good (not amazing, just good) CPUs and good GPUs. Just the bump from Jaguar to Zen 2 is a galaxy-sized improvement.
 

Clear

CliffyB's Cock Holster
Well, that demo wasn't BS per se, and The Matrix demo proves that. UE5 will be behind some really good stuff; we just need some really talented devs behind it. Guess that part takes time.

Tech demos are basically parlour tricks; you figure out the illusion and build entirely with the aim of creating that illusion.

You don't have the luxury of that when making an actual game.

For a start, the amount of resources is drastically reduced due to needing to fit in all the logic and dynamics that a player expects. Then of course, unless you want the gameplay to run on rails, you can't fake or bodge your way around corner cases where the "trick" doesn't look entirely convincing.

Lumen and Nanite aren't magic. Dynamic GI and geometry virtualization make achieving a high level of realism "cheaper" than doing it by conventional means, but that doesn't mean they are cost-free! And as I already pointed out, the devs still have to do all the stuff that any other game does on top of accommodating the new tech.

Wake up and smell the coffee! Optimization is not going to result in performance quadrupling. It's going to gain a few percent at best in average frame-time.
 

Mr.Phoenix

Member
Tech demos are basically parlour tricks; you figure out the illusion and build entirely with the aim of creating that illusion.

You don't have the luxury of that when making an actual game.

For a start, the amount of resources is drastically reduced due to needing to fit in all the logic and dynamics that a player expects. Then of course, unless you want the gameplay to run on rails, you can't fake or bodge your way around corner cases where the "trick" doesn't look entirely convincing.

Lumen and Nanite aren't magic. Dynamic GI and geometry virtualization make achieving a high level of realism "cheaper" than doing it by conventional means, but that doesn't mean they are cost-free! And as I already pointed out, the devs still have to do all the stuff that any other game does on top of accommodating the new tech.

Wake up and smell the coffee! Optimization is not going to result in performance quadrupling. It's going to gain a few percent at best in average frame-time.
I agree with everything you are saying, except the optimization part. While I never said or expected quadrupling performance from optimization, I also do not believe this is the best we can and will get.

I will wait until more capable devs actually use the engine.
 

Montauk

Member
It's pretty much hyperbole, yes. Well, except that Transformers: Prelude To Energon exists (which is a third-person game, mind you, but you can play it as an FPS):

[Screenshot: Transformers (PS2), a Light Unit Decepticlone standing guard]


We have had Horizon Forbidden West on PS4 and Quantum Break running on Xbox One, machines with processors that were used in nettops and GPUs stemming from the Radeon HD 7850 range. That's just the OG machines, which still get a lot of (third-party) love these days.

If anything, IOA tells me a few things:
  • Nanite is a system hog and needs optimization
  • Despite how beautiful it can look, low res kills everything
  • Even with those looks, one could argue that Lumen could have been swapped for pre-baked lighting. Just look at Dead Island 2 and see how good that looks; that's its predecessor's engine, with baked global illumination
So no, I don't think even the base consoles are being tapped fully yet, and I am frankly intrigued that you get so many emojis for it. This is logic, plain and simple.

Because it is unnecessary. Just because an indie dev can get results quickly with UE5 doesn't mean they can also get good results. The game can look great, and it can definitely serve as a graphical benchmark, but it does highlight that the new tech needs optimization, and that optimization shouldn't have to come from Ascendant.

Not sure if you are joking, but the current gen isn't unlike the previous gen. Last gen had relatively good GPUs but sub-par CPUs compared to the gen before that (PS3/360). The current gen, when launched, had good (not amazing, just good) CPUs and good GPUs. Just the bump from Jaguar to Zen 2 is a galaxy-sized improvement.

Yeah, I made similar points myself in the Aveum thread.

These consoles can't handle what you're trying to make them do, so why not switch off Lumen?
 
Then why the hell are these dumbass developers all doing this? They HAVE to know how bad FSR is when using such pitifully low resolutions, yet they do it anyway. All because it's the latest trend and they're too lazy to optimize their games to find a better compromise between graphics settings and image quality.

Now I know why Sony's games are all cross-gen and underwhelming. They know better than anyone that the consoles can't handle next-gen engines and next-gen graphics. But that begs the question: why didn't they design more powerful systems that would be capable of next-gen standards? It always comes down to $$. In the PS2/PS3/360 eras the console makers used to take a bigger loss on each unit. Now, the most I've heard is a $50 loss on hardware.
IMO 30 FPS should be the standard for the current consoles, with native resolutions from 1080p to 1440p. If developers can get a 60 FPS mode in too, then great, but this game, Star Wars, FF16, etc. are showing that the high-FPS mode is starting to suffer now that the cross-gen era is gone.

But yes, I totally agree with your last point about the console makers.
 

Clear

CliffyB's Cock Holster
I agree with everything you are saying, except the optimization part. While I never said or expected quadrupling performance from optimization, I also do not believe this is the best we can and will get.

I will wait until more capable devs actually use the engine.

I was being hyperbolic to make the point that optimization generally isn't going to be able to claw back that much performance outside of cases where bugs or some other sort of malfunction (like pipeline stalls) are producing results drastically below target. And in my experience I'm not sure how much of that would be internally (as in within a dev context) thought of as being optimization as opposed to debugging or re-engineering.

I admit this may sound like splitting hairs, but the reality is that the time-scales and risk-factoring involved make getting the greenlight to undertake such work late in the cycle a big deal. I mean, polishing up versus performing major surgery are two very different processes!

I've said this many times in the past, but I think it's important to restate that the perspective of those who've been working on their game, and thus have watched it evolve through a multitude of states and conditions, is very different from that of an end-user encountering the same title for the first time.

Their vision of what seems like the best compromise is just going to be different, as they've been exposed to more "looks", whereas the gamer just draws comparisons with the standard/best-in-class across a bunch of generalized metrics: what's the res, the frame-rate, what visual artefacts and blemishes does the game have, etc.
 

ArtHands

Thinks buying more servers can fix a bad patch
Thanks.

Well, thanks to Epic and PlayStation for fucking us this gen with expectations.

That tech demo was obviously bullshit.

I'm honestly a bit worried about Gears and Hellblade 2 on console now.

Harsh but not a shock, IMO. Fanboys were overhyping shit back then, and we're witnessing the dream turning into sad reality here.
 

hussar16

Member
Pro consoles won't save us when a 4090 runs this game like shit. Even on a Pro console, instead of a 720p base res you will get a 900p base or maybe a 1080p base. And that is with this game, now, not a UE5 game from 2025 that will push things even harder.

Current graphics hardware is not good enough to do all the work these engines are capable of doing. It is almost like every UE5 game is the next Crysis. Granted we have all this gimmicky algorithm shit to mask that reality (and Nvidia sees the announcement of more gimmicky algorithm shit as exciting), but that's how it is. In the 1990s we got new GPU hardware that was ~doubling performance every year, but now we are going to have 2-3 years between generations while the hardware makers push algorithms.
When everyone has been playing games which are just 2012-tech games, made to run on 2012 consoles, on their beefy PC graphics cards, everyone forgot that decade-old tech will work great and that real 2023 tech isn't easy to run. The same thing happened with Crysis and new engine tech back then. Too many PC fanboys have been fed easy-to-run games based on 2012 technology.
 
When everyone has been playing games which are just 2012-tech games, made to run on 2012 consoles, on their beefy PC graphics cards, everyone forgot that decade-old tech will work great and that real 2023 tech isn't easy to run. The same thing happened with Crysis and new engine tech back then. Too many PC fanboys have been fed easy-to-run games based on 2012 technology.

You do expect the new tech to at least look better, though. When you nerf the base resolution this much it's impossible for the graphics to shine; the reconstruction available on console simply isn't good enough. Target at least 1080p on the big boxes, people; there are a lot of good results on these systems when reconstructing 1080p to 4K.

It would be one thing if Vaseline mode were just a performance toggle, but to make it the only option... 🤷‍♂️
 

rofif

Can’t Git Gud
720p, 20-year-old cinematic.


720p, 17-year-old trailer....


720p. Looks better than anything today.
I am just posting this to prove that 720p does not have to be a limiting factor. And these trailers are blurry and compressed.
 

Bojji

Member
720p, 20-year-old cinematic.


720p, 17-year-old trailer....


720p. Looks better than anything today.
I am just posting this to prove that 720p does not have to be a limiting factor. And these trailers are blurry and compressed.


WTF? You know fixed-resolution screens will always scale things to their native resolution; 1080p looks bad on 4K, but 720p looks like shit. That's why we have AI upscaling built into TVs and all that reconstruction tech on PC and consoles. Too bad FSR2 is the weakest of them all, but even the best one (DLSS) won't make a 720p game look good on 4K.

Developers have fucked-up priorities. Who forced them to use Nanite and Lumen and target 60fps? I'm playing this game on PC right now and I have seen better-looking games on last-gen consoles (truly); textures and many assets are low quality. They could have baked the lighting (there's no time of day) and increased the resolution to 1080p; that, with TSR, would produce acceptable results.
 

Zathalus

Member
720p, 20-year-old cinematic.


720p, 17-year-old trailer....


720p. Looks better than anything today.
I am just posting this to prove that 720p does not have to be a limiting factor. And these trailers are blurry and compressed.

CGI videos are rendered at a far higher resolution than 720p and rely a ton on SSAA to enhance the image. The effective result is well over 8K, sometimes approaching 16K. I think modern CGI can reach even higher numbers.

A game rendered natively at 720p is not the same thing at all.
 

rofif

Can’t Git Gud
WTF? You know fixed-resolution screens will always scale things to their native resolution; 1080p looks bad on 4K, but 720p looks like shit. That's why we have AI upscaling built into TVs and all that reconstruction tech on PC and consoles. Too bad FSR2 is the weakest of them all, but even the best one (DLSS) won't make a 720p game look good on 4K.

Developers have fucked-up priorities. Who forced them to use Nanite and Lumen and target 60fps? I'm playing this game on PC right now and I have seen better-looking games on last-gen consoles (truly); textures and many assets are low quality. They could have baked the lighting (there's no time of day) and increased the resolution to 1080p; that, with TSR, would produce acceptable results.
What are you even talking about?!
This is not about scaling. I am just proving my point that 720p by itself can look pretty good, let alone "upscaled" 720p.

1080p or 720p does not look "bad" on 4K displays. This is a senselessly cultivated myth. I had a 4K display years ago, before integer scaling existed; I was the biggest guy campaigning for integer scaling to exist. Finally third-party software and Nvidia allowed it to happen, and what? For 3D content it looks like shit. If you integer-upscale 3D content, it will be pixelated. And if you just bilinear-stretch (so, the standard way) 720p or 1080p onto a 4K screen, it looks just fine, about the same as it would on a same-size native 1080p screen.
I know, because I had 1080p and 4K 27" monitors next to each other. Integer scaling only makes sense for pixel-art content. All other texture-filtered content just looks best with standard bilinear scaling or any of the new tech like FSR2 or DLSS2.

Stretching resolution and upscaling resolution are completely different ideas. FSR2 and DLSS2 are not stretchers. They are upscalers.
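To make the distinction concrete, here's a minimal Pillow sketch of the two "dumb" scalers, with no reconstruction involved (the filenames are just hypothetical captures):

```python
# Minimal sketch: scaling a 720p frame to 4K with two naive scalers.
# Assumes Pillow is installed and a hypothetical capture "frame_720p.png" (1280x720).
from PIL import Image

src = Image.open("frame_720p.png")

# Integer/nearest scaling: at an exact 3x factor every source pixel becomes
# a 3x3 block of identical pixels. Crisp for pixel art, blocky for filtered 3D.
nearest = src.resize((3840, 2160), Image.NEAREST)

# Bilinear stretch: the "standard" display-style scale. No new detail is
# created; the image just gets bigger and a little softer.
bilinear = src.resize((3840, 2160), Image.BILINEAR)

nearest.save("frame_4k_nearest.png")
bilinear.save("frame_4k_bilinear.png")
```

FSR2 and DLSS2 start from the same problem but add temporal reconstruction on top, which is why they can recover detail that neither of the above ever will.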

CGI videos are rendered at a far higher resolution than 720p and rely a ton on SSAA to enhance the image. The effective result is well over 8K, sometimes approaching 16K. I think modern CGI can reach even higher numbers.

A game rendered natively at 720p is not the same thing at all.
Exactly my point.
CGI is rendered with so-called "ground truth" anti-aliasing.
To get the most out of a resolution, you can do a lot of things.
For example, before upscalers existed, I used to downscale from 4K on my 1080p 27" monitor. And guess what? It improved the picture quality tenfold.
Thankfully these upscalers now exist, and it's their job to do it more cheaply than downscaling and better than just stretching out the image.

FSR2 is just bad at its job. I would say the result is still much better than plain stretched 720p... unless that 720p were impeccable before stretching, which it is not.
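And the 4K-to-1080p downscaling trick I described is just as easy to sketch; this is roughly what DSR-style supersampling boils down to (again, hypothetical filenames):

```python
# Rough sketch of "render high, display low" supersampling.
# Assumes a hypothetical 4K capture "frame_4k.png" (3840x2160).
from PIL import Image

hi_res = Image.open("frame_4k.png")

# Downscaling 2x per axis blends roughly four rendered pixels into every
# displayed pixel, which is what smooths edges and shimmer on a 1080p panel.
supersampled = hi_res.resize((1920, 1080), Image.LANCZOS)
supersampled.save("frame_1080p_supersampled.png")
```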
 

Bojji

Member
What are you even talking about?!
This is not about scaling. I am just proving my point that 720p by itself can look pretty good, let alone "upscaled" 720p.

1080p or 720p does not look "bad" on 4K displays.

I can't agree with you here. It looks like shit.

Even when I had a 1080p TV, 720p PS3 games looked like shit on it.
 

rofif

Can’t Git Gud
I can't agree with you here. It looks like shit.

Even when I had a 1080p TV, 720p PS3 games looked like shit on it.
But PS3 games often had raw 720p, no AA or anything. That didn't push the possibilities of 720p...
 

Clear

CliffyB's Cock Holster
CGI videos are rendered at a far higher resolution than 720p and rely a ton on SSAA to enhance the image. The effective result is well over 8K, sometimes approaching 16K. I think modern CGI can reach even higher numbers.

A game rendered natively at 720p is not the same thing at all.

Not true at all. Sure, parts of the clean-up in post use high internal resolutions (Inferno, Flame, etc.), but in traditional ray-tracing every pixel adds cost, so the original image is unlikely to be that big, especially going back a few years when computational grunt was expensive.
 

Zathalus

Member
Not true at all. Sure, parts of the clean-up in post use high internal resolutions (Inferno, Flame, etc.), but in traditional ray-tracing every pixel adds cost, so the original image is unlikely to be that big, especially going back a few years when computational grunt was expensive.
The original Toy Story was rendered at 1536 × 922 but used quite a bit of SSAA, bumping up the effective resolution considerably. I don't think any of those old CGI videos actually used ray-tracing.
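As rough back-of-the-envelope math (the sample counts here are illustrative assumptions, not Pixar's actual settings), N samples per pixel is roughly equivalent to rendering at √N times the output resolution on each axis:

```python
# Rough arithmetic: effective resolution of SSAA at a given sample count.
# Sample counts are assumptions for illustration, not Pixar's real settings.
base_w, base_h = 1536, 922  # reported Toy Story render resolution

for spp in (16, 64):
    scale = spp ** 0.5  # sqrt(N) extra resolution per axis
    print(f"{spp} spp ~ {int(base_w * scale)} x {int(base_h * scale)} effective")
# 16 spp ~ 6144 x 3688, 64 spp ~ 12288 x 7376 -- which is how you get past 8K
```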
 

sinnergy

Member
The original Toy Story was rendered at 1536 × 922 but used quite a bit of SSAA, bumping up the effective resolution considerably. I don't think any of those old CGI videos actually used ray-tracing.
Yup, Pixar switched to RT pretty late: Monsters University, in 2013.
 

Audiophile

Member
Get a gaming PC.
Building a half-decent gaming PC is ~£1500.

By the end of next year, if we get a Pro and a 'slim' revision plus a price drop on the latter, we're probably looking at £299-349 for the base system and £479-549 for a Pro system. There's a vast gulf between that and £1500+.

Not to mention, some PS exclusives don't make it over, and those that do come more than a year later. Then there's the PS ecosystem, PSVR2 and a general preference for a straightforward experience (and I say that as someone who knows PCs inside out and has built plenty; the console experience is still much more frictionless).


Even with optimisation, stronger devs, more resources and time... a Pro system will be able to do the same thing but with better resolution, visuals and/or performance. I don't get the aversion people have to others having the option of better gaming hardware and the resulting experiences. The Pro isn't going to take away people's base PS5s.

Regarding more work for devs, scaling down the XSX to the XSS is a much stronger example of this. If the rumoured PS5 Pro specs are correct, then scaling up a game built first and foremost for the base PS5 should be relatively straightforward and likely welcomed.

As for it being used as a crutch, that's not really an issue with mid-gen hardware if the performance differential is right. There were very few cases of this with the PS4 & PS4 Pro; most PS4 games still ran really solid, just with a resolution difference between the two. The issue (which we're already seeing here) is devs pushing too many immature feature sets and then misusing reconstruction.
 
To that end, I'd go against the grain and say that I'm actually sort of impressed with how this looks on PS5. I wouldn't have expected a console to be able to run the full UE5.1 feature set (Nanite, Niagara, Lumen) at 60fps* with acceptable image quality. VRR likely takes care of most of the frame drops mentioned in the video, as most fall within the 48-60fps window where VRR is effective. It's a 720p base resolution, but I don't think it's nearly as offensive-looking as people say, at least on PS5 (and it is absolutely NOT the PS3-level image quality we had in that generation).

I suspect that a lot of people's issue with the look of the game is more to do with it artistically being quite ugly lol.
At least on PS5? It runs better on Series X and they are identical apart from the slightly darker gamma ratio on Xbox.
 

Audiophile

Member
I knew those pro consoles were a bad idea back in the day, and here we are. People now want consoles to be like smartphones.

They’ve fucked it.

Honestly just let them win. Bring out a new Xbox every year. I’m done talking about this crap.
Not talking about yearly releases or iterative smartphone-like consoles.

I'm still very much in favour of a generational approach (adamantly so), but a single optional upgraded variant at the halfway mark through the gen to provide better resolution, graphics and/or performance on top of what the base system can reasonably offer. But the same platform and fundamental gaming experience.

X1/X1S & X1X had a large performance gulf and so performance fared worse on the base platform. But PS4 & PS4 Pro was a perfectly executed and measured approach; it was very rare for games on the base system to end up subpar. You still got 900-1080p @ 30fps and occasionally 60fps on PS4 while the Pro just ~doubled the pixel output and upscaled to 4K in most enabled titles. Nothing was taken away from the base system and those who wanted a tighter experience had the option.

If the PS5 Pro is a bump to the latest version of Zen/RDNA, an extra GPU shader engine, higher clocks and a few custom tweaks, then we're likely looking at the same again.

Will some devs use it as a crutch? Sure... but they're probably going to use any number of other things as a crutch anyway (like we're seeing with the over-extended reconstruction here). One dev doing a bad job isn't reason enough in my eyes for another capable dev not to have the option to not just provide a solid base experience that fully takes advantage of the base system, but to also be able to provide an upgraded experience that is only possible on mid-gen hardware.
 

Del_X

Member
Needs a 30fps or 40fps mode on all consoles. All this tells me is that 30fps UE5 will be the default for "cinematic" story-driven games.
 

Clear

CliffyB's Cock Holster
Yup, Pixar switched to RT pretty late: Monsters University, in 2013.

What? Everything Lasseter did (notably from Luxo Jr onward) was ray-traced! The accuracy of lighting and materials handling was always key to Pixar's look, even if the meshes were initially quite simple.

Renderman (their in-house tech) got its first award in 1993!

Ray tracing as a technique has been around a long, long time. It was just prohibitively slow to use initially.
 

Montauk

Member
Not talking about yearly releases or iterative smartphone-like consoles.

I'm still very much in favour of a generational approach (adamantly so), but a single optional upgraded variant at the halfway mark through the gen to provide better resolution, graphics and/or performance on top of what the base system can reasonably offer. But the same platform and fundamental gaming experience.

X1/X1S & X1X had a large performance gulf and so performance fared worse on the base platform. But PS4 & PS4 Pro was a perfectly executed and measured approach; it was very rare for games on the base system to end up subpar. You still got 900-1080p @ 30fps and occasionally 60fps on PS4 while the Pro just ~doubled the pixel output and upscaled to 4K in most enabled titles. Nothing was taken away from the base system and those who wanted a tighter experience had the option.

If the PS5 Pro is a bump to the latest version of Zen/RDNA, an extra GPU shader engine, higher clocks and a few custom tweaks, then we're likely looking at the same again.

Will some devs use it as a crutch? Sure... but they're probably going to use any number of other things as a crutch anyway (like we're seeing with the over-extended reconstruction here). One dev doing a bad job isn't reason enough in my eyes for another capable dev not to have the option to not just provide a solid base experience that fully takes advantage of the base system, but to also be able to provide an upgraded experience that is only possible on mid-gen hardware.

1: No

2: We are not halfway through the gen. You don't know how long this gen will be, and it spent two years in a ditch covered with dirt because of Covid and everything that did to workforces and supply chains.
 

Audiophile

Member
1: No

2: We are not halfway through the gen. You don't know how long this gen will be, and it spent two years in a ditch covered with dirt because of Covid and everything that did to workforces and supply chains.

We're not halfway through the gen, but we probably will be when (and if) a Pro is released.

I think 8yrs would be a fair estimate, but if it somehow ended up being even longer then I'd argue a Pro is even more necessary.

2020 PS5, 2023 PS5 "Slim", 2024 PS5 Pro, 2028/2029 PS6... Though I see MS wanting to get on to the next gen in 2028 given the state of the XSS, so I doubt Sony would risk a 2029 plan.
 

intbal

Member
What are the possibilities of 720p?
Well, if he's specifically referring to 7th-gen possibilities, probably the first Forza Horizon game.
Full native 720p, with 4xMSAA and FXAA. Other games might have fancier graphics, but that was probably the cleanest image you could get out of a 720p game on 7th-gen hardware.
 

Killer8

Gold Member
At least on PS5? It runs better on Series X and they are identical apart from the slightly darker gamma ratio on Xbox.

They aren't identical though. PS5 appears to be sharper with more detail. Go back to the comparison images posted on page 3 to see what I mean. Tom from Digital Foundry came out and admitted that there is a difference despite both versions using the same base 720p resolution. It's possible PS5 is using some additional processing of the image like AMD's CAS, but the result is a perceptually better looking image either way.
 

rofif

Can’t Git Gud
"1080 or 720p does not look "bad" on 4k displays. This is a senselessly cultivated myth."

rofif



I am speaking from experience about what I consider to be observable facts. There is very little difference between 1080p on a 1080p screen and 1080p on a native 4K screen. The difference is way overstated by some and vastly misunderstood.
In pixel-art games, sure, yeah. But in filtered games with AA?! Not at all. You will not see a difference whether you play a 1080p game on a 4K screen or a 1080p screen. Any difference you see is just because you moved to a bigger screen.

When you bilinearly stretch an anti-aliased, filtered picture, you are not making it look worse. You just make it bigger.
For pixel-art games, you will lose 1:1 pixel matching and make them look blurry, even 1080p games on 4K monitors. No monitor has a built-in integer scaler, but Nvidia supports it in drivers and most pixel-art games support raw 4K anyway.

Listen, I at least know what I am talking about. I am very anal about these things and did a lot of testing. I had about 10 monitors in the 2018-2021 period before I got an OLED.

I don't want to sound smug or anything, but this topic of bilinear vs integer scaling, resolution handling and so on is something I am very knowledgeable about. We can talk about this stuff forever.
I might be disgruntled, but I know my stuff. Maybe I have a problem communicating with people here, but that's another issue....

TL;DR: Don't mix up stretching, bilinear scaling and integer scaling. Pixel-art games will look terrible stretched without integer scaling, even on a 1:4 display. 3D anti-aliased games don't care, due to the nature of smoothed pixels.
 
720p, 20-year-old cinematic.


720p, 17-year-old trailer....


720p. Looks better than anything today.
I am just posting this to prove that 720p does not have to be a limiting factor. And these trailers are blurry and compressed.


The overall image clarity matters the most, for sure. Ryse: Son of Rome is still a looker all these years later. Especially on the 1440p screen I'm using, native 720p can absolutely look sharper than FSR working from too low a base resolution; the image will be more aliased, but it won't have the blurring.
 

Montauk

Member
We're not halfway through the gen, but we probably will be when (and if) a Pro is released.

I think 8yrs would be a fair estimate, but if it somehow ended up being even longer then I'd argue a Pro is even more necessary.

2020 PS5, 2023 PS5 "Slim", 2024 PS5 Pro, 2028/2029 PS6... Though I see MS wanting to get on to the next gen in 2028 given the state of the XSS, so I doubt Sony would risk a 2029 plan.

I’m not interested until there’s a PS5 Pro 2, frankly.
 
I played it to completion on PS5 and honestly the softness wasn’t a major issue.

The performance dips I assumed were there are confirmed. I can definitely feel the drops into the 40s but it isn’t often.

Lying on the couch and playing it, I had no image quality complaints.

Lumen providing real-time lighting to an entire map is definitely noticeable and really neat to see.

[Four screenshots from the PS5 version]
 

Lysandros

Member
I played it to completion on PS5 and honestly the softness wasn’t a major issue.

The performance dips I assumed were there are confirmed. I can definitely feel the drops into the 40s but it isn’t often.

Lying on the couch and playing it, I had no image quality complaints.

Lumen providing real-time lighting to an entire map is definitely noticeable and really neat to see.

[Four screenshots from the PS5 version]
The resolve of the PS5 version seems to be quite a bit higher than the perceived "base 720p", due to whatever 'magic' DF somehow managed to miss. I think at this point they should just contact the developers and ask about the matter, instead of throwing out layman's guesses, if they want to stop looking somewhat inept.
 

CGNoire

Member
What? Everything Lasseter did (notably from Luxo Jr onward) was ray-traced! The accuracy of lighting and materials handling was always key to Pixar's look, even if the meshes were initially quite simple.

Renderman (their in-house tech) got its first award in 1993!

Ray tracing as a technique has been around a long, long time. It was just prohibitively slow to use initially.
I'm pretty sure RenderMan was still rasterizing via REYES back then. They may have used ray tracing for glass and reflective materials, but not for lighting.
 