Digital Foundry: Remnant 2 - An Unreal Engine 5 Nanite Showcase? PS5 vs Xbox Series X/S DF Tech Review

The game only looks comparable to native 1440p when the camera isn't moving, guys. Don't fool yourself into thinking you can get "good" IQ reconstructing from 720p; even DLSS can't really do that.
That's always been the issue with image reconstruction but somehow, you have people blatantly ignoring this. And even in the still shots Oliver shows, the native 1440p shot is still clearly better.
 
720p TSR upscaled to 1440p can look equivalent to 1440p in still shots. The problem with TSR (and FSR) is that any heavy motion causes the reconstruction to break down and leads to artifacting.
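For anyone wondering why stills and motion behave so differently, here's a minimal toy sketch of the idea behind temporal super resolution (1D, invented numbers, nothing like any engine's actual code): with a still camera, sub-pixel jitter lets samples from many frames accumulate into a higher-res history, but the moment reprojection fails, the output falls back to a single low-res frame.

```python
# Toy 1D model of temporal super resolution (illustrative only, not engine code).
import math

LOW, HIGH = 16, 64                  # render res vs. output res (4x upsample)
JITTERS = [0.0, 0.25, 0.5, 0.75]    # sub-pixel offsets cycled per frame
ALPHA = 0.2                         # history blend weight

def scene(x):
    # Fine detail a single 16-sample frame can't resolve on its own.
    return math.sin(40.0 * x)

history = [0.0] * HIGH
for n in range(64):                 # camera perfectly still: detail accumulates
    jitter = JITTERS[n % 4]
    for i in range(LOW):
        pos = (i + jitter) / LOW                  # jittered sample position
        px = min(HIGH - 1, round(pos * HIGH))     # output pixel it lands on
        history[px] = (1 - ALPHA) * history[px] + ALPHA * scene(pos)

err_still = sum(abs(history[i] - scene(i / HIGH)) for i in range(HIGH)) / HIGH

# Camera moves: reprojection fails, the history is rejected, and the output
# is just nearest-neighbour from a single low-res frame.
err_motion = sum(abs(scene((i * LOW // HIGH) / LOW) - scene(i / HIGH))
                 for i in range(HIGH)) / HIGH

print(f"avg error while still (accumulated): {err_still:.3f}")   # ~0.02
print(f"avg error in motion (history lost):  {err_motion:.3f}")  # much larger
```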

Lol, no, he didn't say that. He said that in still shots, they look similar, and even then, you can use your eyes to clearly see that the console version is blurrier. It's also better than 1440p using FSR Performance mode which reconstructs from 720p, but FSR sucks at that level, so not surprising.

Image reconstruction tends to fall apart in motion, not in stills.
Ok... so we have the video too, which I believe we all watched.

Is the image quality falling apart in motion? If not, what's the point of what you guys are saying then?
 
Upscaling, frame generation, all that is well and good, but we're talking about what the machines are able to natively render given their power.

It's hard to isolate things that much though, because the upscaling tech is altering decisions in the design process along the way. If upscaling wasn't an option they would target higher base resolutions and make cutbacks elsewhere.

There are definitely some instances where the final image probably isn't as high quality as it should have been (Dead Space looks better at a lower res without the FSR, etc.), but as long as the final image looks okay, it doesn't really matter, IMO.
 
Quality mode: 1296p
Balanced mode: 792p
Performance mode: 720p
720P??😂😂
 
Ok... so we have the video too, which I believe we all watched.

Is the image quality falling apart in motion? If not, what's the point of what you guys are saying then?
It does. Every FSR and TSR implementation does this. Just a limitation of temporal reconstruction without an AI pass.
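Roughly what that non-AI heuristic looks like, for the curious (simplified single-channel sketch; real implementations do something like this per channel in a colour space like YCoCg): the reprojected history colour gets clamped to the min/max of the current frame's local neighbourhood, so whenever motion makes the history disagree with the new frame, the accumulated detail is simply thrown away.

```python
# Heuristic history rejection used by TAA/TSR/FSR2-style upscalers
# (simplified sketch): clamp the reprojected history colour to the
# min/max of the current frame's local neighbourhood.

def clamp_history(history_color, neighborhood):
    lo, hi = min(neighborhood), max(neighborhood)
    return max(lo, min(hi, history_color))

# Still camera: history agrees with the neighbourhood, accumulation proceeds.
print(clamp_history(0.52, [0.50, 0.51, 0.53, 0.55]))   # -> 0.52, history kept

# Fast motion / disocclusion: history no longer matches what's on screen,
# so it gets clamped toward the (noisy, low-res) current frame instead.
print(clamp_history(0.52, [0.10, 0.12, 0.15, 0.11]))   # -> 0.15, history discarded
```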
 
Ok... so we have the video too, which I believe we all watched.

Is the image quality falling apart in motion? If not, what's the point of what you guys are saying then?
Look at the foliage in the first still shots that compare them directly. It's incredibly noisy on the console side, especially the tree leaves on the top left that sway in front of the building, and that's just a slight movement. With fast motion and fine detail, it'd be even worse. We already KNOW this and we've known it for years, so why are you pretending not to?

Oliver said still shots offer similar detail levels, i.e., the upsampling is able to resolve a similar amount of detail to native... it doesn't mean they're identical, especially not with fine moving objects or fast movements.
 
While everyone is playing spot-the-difference with the resolution, I'm extremely impressed with Nanite. It's working as intended and has basically vanquished LOD pop-in. Extremely impressive stuff, and it goes a huge way toward making games look less game-y.
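A toy way to see what Nanite removes (made-up numbers, not Epic's actual algorithm): classic pipelines swap whole discrete meshes at distance thresholds, which is the visible pop, while Nanite-style selection picks per-cluster detail continuously so projected triangles stay around a pixel.

```python
# Why traditional LODs "pop" and Nanite-style selection doesn't (toy sketch,
# invented numbers). Classic pipelines swap whole meshes at fixed distances;
# continuous selection shrinks triangle density smoothly instead.

LOD_DISTANCES = [0, 20, 50, 100]        # metres at which LOD 0/1/2/3 kick in

def discrete_lod(distance):
    lod = 0
    for i, d in enumerate(LOD_DISTANCES):
        if distance >= d:
            lod = i
    return lod

def continuous_triangles(distance, screen_height_px=1440, fov_scale=1.0):
    # Aim for roughly pixel-sized triangles: density falls off continuously
    # with distance rather than in steps.
    density = screen_height_px * fov_scale / max(distance, 1.0)
    return int(density ** 2)

for d in (19, 20, 21, 49, 50, 51):
    print(f"{d:>3} m  discrete LOD {discrete_lod(d)}   "
          f"continuous ~{continuous_triangles(d):>6} tris")
# Discrete LOD jumps 0->1 at 20 m and 1->2 at 50 m (the visible 'pop');
# the continuous triangle count just shrinks smoothly.
```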
 
People seem to forget that Epic said they targeted UE5 to run at 1080p 30fps, upscaled with TSR. This was to be expected.
And of course, 60fps has to sacrifice even more base resolution.
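Back-of-the-envelope pixel math on those numbers (assuming 16:9 throughout) shows why 60fps modes cut the base resolution so hard:

```python
# Pixel counts behind the quoted modes (16:9 assumed; rough arithmetic only).
modes = {"native 1440p": 1440, "quality 1296p": 1296, "Epic's 1080p target": 1080,
         "balanced 792p": 792, "performance 720p": 720}
base = 2560 * 1440
for name, h in modes.items():
    w = h * 16 // 9
    px = w * h
    print(f"{name:>20}: {w}x{h} = {px/1e6:.2f} MPix ({px/base:.0%} of 1440p)")
# Performance mode renders exactly a quarter of 1440p's pixels, which is the
# scale of cut a 30fps -> 60fps target tends to force when the GPU is the
# bottleneck.
```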
 
That's always been the issue with image reconstruction but somehow, you have people blatantly ignoring this. And even in the still shots Oliver shows, the native 1440p shot is still clearly better.

Yes, that's probably mainly from console players who can't really test for themselves how much IQ is lost when reconstructing from a super low resolution. And that's the problem: all games should be reconstructed from 1080p to 4K, not fucking 720p...

720p....

720p...

In 2023... next-gen games are 720p...

Yep, that's horrible. I expected 1080p games at worst but this shit is much worse than that LOL.
 
UE5 and it still looks so ugly.

Are we still saying the XSX doesn't have some bottleneck that's holding it back from using its full 12TF?
 
I mean, upscaling does cover a multitude of sins, but it's fair to ask what's happening with the technical execution on some recent games - especially given the fact that game design isn't taking some bonkers leap over the previous generation. Open worlds became a meme last gen, but from a game design standpoint, they were a considerable step over the prior generation - games like RDR2 and HZD ended up looking and running just as good as linear action games, even with heaps of dynamic elements and real-time simulation at play.

Looking at FFXVI and Remnant II, I'm just not seeing where the render budget is being spent. Honestly, if rounding off a few edges and squeezing in a few more triangles is costing that much, just bin that shit and spend the budget on some bonkers gameplay systems instead. If you can do TOTK on a tablet, I can hardly imagine what you could pull off on PS5/XSX.
 
Crazy to think how much shit that console got last gen... and here we are.
To think their CPUs were still based on Athlon-era engineering. No wonder we're where we are today: the PC version barely uses the CPU, and it shows in the poor performance. Developers are still making games 101% GPU-bound.
 
I see why they are getting the Pro out ASAP, because as more games come out running at these resolutions, plenty of the more enthusiastic gamers are gonna read this shit and go straight to PC.

The PS5 and Series X were 100 percent just Pro consoles of Pro consoles.
 
I am playing it on PC right now and I don't understand why this game hogs resources so much. It's not exactly a looker, just good enough for current gen. Load times are great on an M.2 SSD, but that's about the most impressive technical feat. I don't even know why Nanite is important here; it's not an open-world game, and there are no distant objects for it to really improve the LOD on.

Overall this looks more like bad optimization than anything else. 720p on PS5/XSX is an embarrassment.
 
I see why they are getting the Pro out ASAP, because as more games come out running at these resolutions, plenty of the more enthusiastic gamers are gonna read this shit and go straight to PC.

The PS5 and Series X were 100 percent just Pro consoles of Pro consoles.
This is the worst reason for a Pro.
Just because devs are using the console as a budget PC doesn't mean it has to be that way. It's more than a budget PC if they decide to use it that way.
Besides, 720p or not, the quality mode is 1300p and balanced mode is 800p... but it's dynamic, and the end result is comparable to 1440p anyway.
So just saying the game is 720p because it can sometimes dip to that in one mode is not very realistic.
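Something like this, as a generic sketch of how dynamic res usually behaves (invented numbers and heuristic, not this game's actual logic): the engine scales the render height each frame to stay under the GPU budget, and 720p is just the clamp floor it hits on the heaviest frames.

```python
# Generic dynamic-resolution heartbeat (toy sketch, made-up frame costs):
# nudge the render height so GPU time stays under budget; 720p is the floor.

BUDGET_MS = 16.6            # 60fps target
MIN_H, MAX_H = 720, 1440

def next_height(height, gpu_ms):
    # Cost scales roughly with pixel count, i.e. with height squared.
    scale = (BUDGET_MS / gpu_ms) ** 0.5
    return max(MIN_H, min(MAX_H, int(height * scale)))

h = 1440
for gpu_ms in (14.0, 18.0, 28.0, 40.0, 15.0, 12.0):   # heavy scene, then relief
    h = next_height(h, gpu_ms)
    print(f"frame cost {gpu_ms:>4} ms -> render at {h}p")
# Only the worst frames bottom out at the 720p floor; the reported "720p"
# is the dip, not where the game sits most of the time.
```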

It would be exactly the same with Pro consoles. They wouldn't optimize the shadows or some shit and would run down to 720p anyway, because it's the easiest performance solution.
A Pro will just move the goalposts. We need true care and devs who dig into the bespoke architecture. Anyone can slap a PC game on PS5, set the UE graphics preset to medium and be done with it. That's so bad.
 
And people keep parroting that we don't really need Pro consoles... UE5 seems to be heavy af with Nanite, and if you add Lumen on top of that, oh boy!

Edit: And no, mid-sized dev studios don't have the experience, manpower and talent that Epic Games has at its disposal to optimize UE5 titles as much as the latter did with Fortnite.
Then stop using UE5, Nanite, and/or Lumen. The hardware isn't there yet, and the benefits are minor relative to the costs.
 
Shit framerate, shit image quality... looks like this gen is going in the wrong direction.

Most current-gen-only games have dogshit IQ and performance while barely looking better than last-gen games; only some exclusives are above that.
FWIW the IQ is very good on PC, even using DLSS.
 
Ouch at that Series S result. 1080p output, 30fps and no motion blur. Yikes.
It's 900p upscaled to 1080p, and it dips to 26fps in the footage DF showed, probably even lower somewhere deeper in the game. Dat XSS proves to be as awful as I imagined :P
 
It's 900p upscaled to 1080p, and it dips to 26fps in the footage DF showed, probably even lower somewhere deeper in the game. Dat XSS proves to be as awful as I imagined :P
Yeah... I mean, at least it's a $200 Game Pass box, so anyone can play games if they don't care about "high end".
 
Yet there are people saying "WE DON'T NEED A PRO THIS GENERATION!"

They need to gtfoh.
Because the variable at fault is the developer, not the console. Just look at Rift Apart that looks much better and adds RT on top and runs at 1800-2160p, at 30fps+ with the absolute bottom being 1296p.

Clearly, the consoles are capable of much more. You just need someone competent enough.
 
This is the worst reason for a Pro.
Just because devs are using the console as a budget PC doesn't mean it has to be that way. It's more than a budget PC if they decide to use it that way.
Besides, 720p or not, the quality mode is 1300p and balanced mode is 800p... but it's dynamic, and the end result is comparable to 1440p anyway.
So just saying the game is 720p because it can sometimes dip to that in one mode is not very realistic.

It would be exactly the same with Pro consoles. They wouldn't optimize the shadows or some shit and would run down to 720p anyway, because it's the easiest performance solution.
A Pro will just move the goalposts. We need true care and devs who dig into the bespoke architecture. Anyone can slap a PC game on PS5, set the UE graphics preset to medium and be done with it. That's so bad.

Well, that's the issue (actually a blessing and a curse, IMO) with modern consoles. At the end of the day, they are just budget PCs with a TV-centric operating system, and have been since the switch to an x86 architecture.
 
Because the variable at fault is the developer, not the console. Just look at Rift Apart that looks much better and adds RT on top and runs at 1800-2160p, at 30fps+ with the absolute bottom being 1296p.

Clearly, the consoles are capable of much more. You just need someone competent enough.

There's a massive difference between an in-house first-party developer, with the tremendous advantage of no budget or time constraints, working on their company's own platform, versus a much smaller studio creating a multiplatform game.
 
There's a massive difference between an in-house first-party developer, with the tremendous advantage of no budget or time constraints, working on their company's own platform, versus a much smaller studio creating a multiplatform game.
Sure, but that doesn't change what I said. Furthermore, optimization and good performance should be high on the list of priorities. A lot of things should go before you forgo optimization, but this is clearly not the case.

There's really no excuse besides "it's not that important".
 
Right, you guys are gonna have to actually say that it's using FSR or supersampling or whatever method, because when you say 720p, we're picturing an actual 720p image.
 
Another 720p game in performance mode on PS5 just like FF16. I'm sensing a trend here in 2023...
All those new things will push these consoles to 540p in no time; I fully expect 540p-only games by 2027.

Upscaled with great techniques of course to 1080p/1440p.
 
Well, that's the issue (actually a blessing and a curse, IMO) with modern consoles. At the end of the day, they are just budget PCs with a TV-centric operating system, and have been since the switch to an x86 architecture.
But they're much more than just budget PCs; devs just need to tap into that.
What is true is that they allow much quicker ports from PC to consoles, but if you dig deeper, you can do so much more.
It's a common misconception that the architecture is the same as a PC's. It's different enough.
 
720p and can't even hold 60fps.
Man, how quickly the dream of this gen being at least 1080p 60fps died.
And this isn't even using Lumen, once we get UE5 games using the full feature set we'll probably see console games running at 480p 25fps.

Honestly I've yet to see a single next gen game with visuals that seem worth the drop in resolution and framerate. We seem to have reached the point where getting a game to look 10% better needs 200% stronger hardware and it's not worth it.
 