Graphical Fidelity I Expect This Gen

GymWolf what did you think of Plague 2's graphics, given it's on its own dev engine?

Been getting through it and I'd say it's worth giving real respect. Looks damn incredible in ultrawide.
Plague 2? The old one?

Some great looking locations, pretty meh characters; a good looking AA game that is not gonna win any graphics award unless it's a dry year.
 
Something still feels off when it comes to hair rendering on customizable avatars; I can see it getting better than it currently is. Either that, or devs aren't fully doing it justice.

Bespoke, set characters look damn nice, like in Clair Obscur for example, but the above is rather uncanny. I just feel it can still get better.
Uncanny valley is about animation, not visuals. The animation always needs to be further ahead in quality than the visuals to bridge the gap. The better the visuals, the more our eyes expect animated perfection.
 

Almost 300k polygons per enemy is actually insane. If it wasn't for UE5 and Nanite skeletal meshes I would have no faith the game could look like that. Not sure how the gore system is possible on consoles though.
 
Nope, you are wrong, and you should feel very, very bad about it /s

you bunch of no lives who can't enjoy anything. I know this is a graphics thread but cmon now.
Death stranding is very impressive and you choose to focus on some bullshit?
SlimySnake show me another UE5 game on PS5 that runs a stable 40 fps at 1440p+ with global RTGI and Nanite? Because AC Shadows does that and you know how good it looks.
So why are you choosing to talk about the performance mode? it is irrelevant

Just get PCs, start modding and stop whining. jesus christ that's so embarrassing.
Cmon now. This is why games take 10 years to make lol and then you are still complaining

And again - attacking me personally makes no sense. Just makes you guys look cheap. you have no arguments so what do you do? oh yeah "forspoken lolololol the guy liked forspoken". this is so low I shouldn't even be responding.
All because Death Stranding rocks? In a scene that universally makes everyone go "wow that's amazing"?
 

Almost 300k polygons per enemy is actually insane. If it wasn't for UE5 and Nanite skeletal meshes I would have no faith the game could look like that. Not sure how the gore system is possible on consoles though.

Depends on how many characters are on screen at once. Even before virtualised geometry we had car games with 300,000-plus-poly vehicles. I think the original trailer for this only had 1 to 2 characters on screen at a time. Could be that once skeletal Nanite dropped, they updated the enemy count.
 

Almost 300k polygons per enemy is actually insane. If it wasn't for UE5 and Nanite skeletal meshes I would have no faith the game could look like that. Not sure how the gore system is possible on consoles though.

Wow, thank you Unreal. I really wanted to use 24GB of VRAM so this character's teeth can have 50k polygons if I look really close.
Can't you guys see this is bullshit? Don't you have any critical thinking?
This is an absolute waste of resources. In a dark horror game. You shoot the guy in the head and move on. You don't noclip into his teeth :P
 
Looks amazing. One thing though: it really bothers me that his shirt doesn't ripple in the wind. It's the little things like that that take devs like R* and ND to another level many other studios just can't reach, despite having arguably better looking games technically.
That shit actually pisses me off since even MGS3 on PS2 had rippling clothing. It doesn't even have to be a "real" simulation but the animation has to be there. Shit like that and wind simulation should be standard by now.

At least Ubisoft seems to have recently, finally made wind sim standard in their games... there is at least that.
 
Plague 2 is UE4. Plague 3 will be UE5.

Really? I thought Asobo had it on their own engine, unless that was just for the first one, or I could have been misled (or misread). If so, it's impressive for UE4.

Still a cool IP trying to punch above its weight given its budget and so forth. Beautiful, beautiful environments.
 
Wow, thank you Unreal. I really wanted to use 24GB of VRAM so this character's teeth can have 50k polygons if I look really close.
Can't you guys see this is bullshit? Don't you have any critical thinking?
This is an absolute waste of resources. In a dark horror game. You shoot the guy in the head and move on. You don't noclip into his teeth :P
 
Really? I thought Asobo had it on their own engine, unless that was just for the first one, or I could have been misled (or misread). If so, it's impressive for UE4.

Still a cool IP trying to punch above its weight given its budget and so forth. Beautiful, beautiful environments.
No idea where the misinformation that Asobo uses Unreal Engine stems from, but they do not use it.
They have their own custom engine, which they actually also use for Flight Simulator.
 
you bunch of no lives who can't enjoy anything. I know this is a graphics thread but cmon now.
Death stranding is very impressive and you choose to focus on some bullshit?
SlimySnake show me another UE5 game on PS5 that runs a stable 40 fps at 1440p+ with global RTGI and Nanite? Because AC Shadows does that and you know how good it looks.
So why are you choosing to talk about the performance mode? it is irrelevant
I don't know where you are getting your numbers from. You can check them out here.



The PS5 already drops below 1440p in 30 fps mode. In 40 fps mode, the graphics load would increase by 33%, which means a similar reduction in resolution as well. It would be closer to 1080p than 1440p in the 40 fps mode.
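Rough napkin math for that claim (just a sketch: it assumes GPU cost scales roughly linearly with pixel count and a 1440p starting point, neither of which is a measured figure):

```python
# Back-of-the-envelope only: assumes GPU cost scales ~linearly with pixel count
# and that the 30 fps mode sits at 2560x1440. Both are simplifications.
frame_time_30 = 1000 / 30          # ~33.3 ms of GPU time per frame at 30 fps
frame_time_40 = 1000 / 40          # 25.0 ms per frame at 40 fps

budget_ratio = frame_time_40 / frame_time_30   # 0.75 -> ~25% less GPU time per frame

pixels_30 = 2560 * 1440            # assumed 1440p starting point
pixels_40 = pixels_30 * budget_ratio

# Keep a 16:9 aspect ratio and see what vertical resolution that budget buys.
height_40 = (pixels_40 * 9 / 16) ** 0.5
print(round(height_40))            # ~1247 -> well below 1440p, roughly the 1200p range
```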

And that's OK. We saw something similar in Avatar and Star Wars Outlaws. RTGI is expensive. I'd rather devs try and hit these features than do nothing and target native 4K like Sony devs do. The point is that UE5, Snowdrop, Anvil and Northlight are all offering the same levels of performance: 1440p 30 fps, and sub-1080p 60 fps. The only difference is that Anvil turns off RT in the 1080p 60 fps mode whereas Lumen remains enabled, albeit software Lumen.

If Death Stranding is native 4K like Spider-Man 2, Horizon, Ratchet, GoW, Demon's Souls and GT7, then it's effectively a cross-gen game that doesn't push new tech and simply pushes pixels. Now that is literally wasting the GPU.

If it turns out that it's 1440p 30 fps or 4K CB 30 fps, and 1080p 60 fps, then fine, that means they are using the GPU for more than just pushing pixels, be it draw distance, better visual effects, more dynamic lighting, more geometry/foliage, more detail, etc. I would not be as harsh in that case even if they chose not to invest in mesh shaders or ray tracing.

I do not want ray tracing in every game. If the lighting looks good, like it does in the opening mountain region and the fireworks area, then I'm OK with them saving GPU resources to invest in other areas. Nanite or mesh shaders, on the other hand, are essentially free, but if the results are OK then I don't care. Star Wars Jedi: Survivor, Callisto, Starfield and Outlaws do not use mesh shaders and have stunningly detailed assets. So if DS2 can have similar asset quality then I wouldn't mind the lack of mesh shader support either.

That particular screenshot looks really bad, and maybe it's just a bug, but if the game is full of that level of detail then yes, I will bitch about the lack of mesh shader support.
 
Really? I thought Asobo had it on their own engine, unless that was just for the first one, or I could have been misled (or misread). If so, it's impressive for UE4.

Still a cool IP trying to punch above its weight given its budget and so forth. Beautiful, beautiful environments.
Just checked. I'm wrong, apparently it runs on their own engine like you said.
 
This is a main character in a badly running game.
What's your point?
I don't know where you are getting your numbers from. You can check them out here.



The PS5 already drops below 1440p in 30 fps mode. In 40 fps mode, the graphics load would increase by 33%, which means a similar reduction in resolution as well. It would be closer to 1080p than 1440p in the 40 fps mode.

And that's OK. We saw something similar in Avatar and Star Wars Outlaws. RTGI is expensive. I'd rather devs try and hit these features than do nothing and target native 4K like Sony devs do. The point is that UE5, Snowdrop, Anvil and Northlight are all offering the same levels of performance: 1440p 30 fps, and sub-1080p 60 fps. The only difference is that Anvil turns off RT in the 1080p 60 fps mode whereas Lumen remains enabled, albeit software Lumen.

If Death Stranding is native 4K like Spider-Man 2, Horizon, Ratchet, GoW, Demon's Souls and GT7, then it's effectively a cross-gen game that doesn't push new tech and simply pushes pixels. Now that is literally wasting the GPU.

If it turns out that it's 1440p 30 fps or 4K CB 30 fps, and 1080p 60 fps, then fine, that means they are using the GPU for more than just pushing pixels, be it draw distance, better visual effects, more dynamic lighting, more geometry/foliage, more detail, etc. I would not be as harsh in that case even if they chose not to invest in mesh shaders or ray tracing.

I do not want ray tracing in every game. If the lighting looks good, like it does in the opening mountain region and the fireworks area, then I'm OK with them saving GPU resources to invest in other areas. Nanite or mesh shaders, on the other hand, are essentially free, but if the results are OK then I don't care. Star Wars Jedi: Survivor, Callisto, Starfield and Outlaws do not use mesh shaders and have stunningly detailed assets. So if DS2 can have similar asset quality then I wouldn't mind the lack of mesh shader support either.

That particular screenshot looks really bad, and maybe it's just a bug, but if the game is full of that level of detail then yes, I will bitch about the lack of mesh shader support.

PS5 Pro. I am talking about my experience. I don't give a fuck about the base PS5 anymore.
 
This is a main character in a badly running game.
What's your point?

PS5 Pro. I am talking about my experience. I don't give a fuck about the base PS5 anymore.
I started off by saying that the PS5 Pro performance mode drops to 864p, which is what the performance mode of UE5 games on the base PS5 drops to.

You then brought up the 1440p 40 fps mode, but you can see that the PS5 Pro's quality mode already drops to 1440p in 30 fps mode. Your math doesn't add up. The 40 fps mode will drop further than that. It has to.

UE5, Snowdrop and Northlight all perform the same. The cost of ray tracing is what it is. The only issue with UE5 was that hardware Lumen had a CPU cost which was bottlenecking the 60 fps modes, so they just shipped most games with software Lumen. Starting from UE5.4, that's no longer an issue.
 
I started off by saying that the PS5 Pro performance mode drops to 864p, which is what the performance mode of UE5 games on the base PS5 drops to.

You then brought up the 1440p 40 fps mode, but you can see that the PS5 Pro's quality mode already drops to 1440p in 30 fps mode. Your math doesn't add up. The 40 fps mode will drop further than that. It has to.

UE5, Snowdrop and Northlight all perform the same. The cost of ray tracing is what it is. The only issue with UE5 was that hardware Lumen had a CPU cost which was bottlenecking the 60 fps modes, so they just shipped most games with software Lumen. Starting from UE5.4, that's no longer an issue.
but it's 1440p

Can I just post some of my AC Shadows screenshots to de-escalate the tension?
Seriously, if there was ever a game worth playing for graphics, it's this one :P and the screenshots don't capture the amazing HDR and real-time viewing with the trees swaying, etc.
 
Wow, thank you Unreal. I really wanted to use 24GB of VRAM so this character's teeth can have 50k polygons if I look really close.
Can't you guys see this is bullshit? Don't you have any critical thinking?
This is an absolute waste of resources. In a dark horror game. You shoot the guy in the head and move on. You don't noclip into his teeth :P
It doesn't fill up VRAM or render all 50k at once; it's just zoom-proof because of it. The way Nanite works, it never renders more than a small handful of polygons per pixel, loading polys in and out as it goes. Having the teeth be 50k or even 50 billion polys wouldn't affect how many are stored in VRAM or rendered at once at all. That's not how it works.

The original UE5 demo running on PS5 only ever rendered, or fit into VRAM, 22 million polys even though the scene consisted of over a billion.
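Rough toy model of what that means in practice (an illustration only, not Epic's actual Nanite code; the one-triangle-per-pixel budget is an assumed simplification):

```python
# Toy illustration of the idea above, NOT Epic's actual Nanite implementation:
# the number of triangles actually rendered tracks how many pixels the mesh
# covers on screen, not how dense the source asset is. Clusters that are
# off-screen or too small simply never get streamed in or drawn.

def triangles_rendered(source_tris: int, pixels_covered: int,
                       tris_per_pixel: float = 1.0) -> int:
    """Rendered triangles scale with screen coverage, capped by the source mesh."""
    budget = int(pixels_covered * tris_per_pixel)
    return min(source_tris, budget)

# 50k-poly teeth filling the frame in a zoomed-in photo mode shot...
print(triangles_rendered(50_000, 2_000_000))      # 50000 -> full detail only when zoomed in
# ...versus the same teeth at gameplay distance, covering ~50 pixels:
print(triangles_rendered(50_000, 50))             # 50
# Even an absurd 50-billion-poly source costs the same at that distance:
print(triangles_rendered(50_000_000_000, 50))     # 50
```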
 
It doesn't fill up VRAM or render all 50k at once; it's just zoom-proof because of it. The way Nanite works, it never renders more than a small handful of polygons per pixel, loading polys in and out as it goes. Having the teeth be 50k or even 50 billion polys wouldn't affect how many are stored in VRAM or rendered at once at all. That's not how it works.

The original UE5 demo running on PS5 only ever rendered, or fit into VRAM, 22 million polys even though the scene consisted of over a billion.
It's a great idea, and it's as good an idea as RT is.
But the first real game I've noticed that has zero pop-in might be AC Shadows.
It will probably be the standard from now on. Decima will probably only update to it with Horizon 3.
 
It's a great idea, and it's as good an idea as RT is.
But the first real game I've noticed that has zero pop-in might be AC Shadows.
It will probably be the standard from now on. Decima will probably only update to it with Horizon 3.
It's a great idea but you only like it when it's not unreal doing it?
 
-----everything-------
*puts out bullshit
- gets corrected
*doubles down on bullshit while making up more bullshit
- gets corrected with more facts
*either pretends unrelated bullshit somehow changes the amount of bullshit or doesn't accept facts because --insert bullshit like "i have a buddy I trust, my mom said etc"--
- gets called out for trying to ignore verified facts
*either changes topic or pretends to be a victim of bullying in some way

every rofif discussion ever.

Stupidity Are You Stupid GIF

actually.... no need to ask with so much empirical data.

Idiot Reaction GIF
 
The trailer is unreal, which is why I have zero belief the game will look anywhere close to this on release. Atomic Heart was released in early 2023. Are we really to believe the same studio can produce a game of that quality in 2.5 years? Not to mention they are working on a completely new IP as well, which they also had a trailer for
Have you played Atomic Heart? For an AA game it looked splendid on PC most of the time.
 
but it's 1440p

Can I just post some of my AC Shadows screenshots to de-escalate the tension?
Seriously, if there was ever a game worth playing for graphics, it's this one :P and the screenshots don't capture the amazing HDR and real-time viewing with the trees swaying, etc.
Of course, the devs will say it's 1440p with drops. They also said 1296p with drops for the 60 fps mode, but the actual figures are 864p. Believe your eyes, not dev PR.
 
It's a great idea but you only like it when it's not unreal doing it?
The Unreal hate on this board is so silly. People conflate the stuttering issues with performance time and time again. And the shader compilation issues don't even affect consoles, so not sure why console gamers complain about it so much. It's just traversal stutters, and they only happen when crossing invisible loading screens. Sony games use retarded crawling sections to hide them. I'd rather have a spike than a 20-second-long shimmying section, or watch Joel and Ellie hoist each other up ladders for a full minute.

If anything, software Lumen is a good compromise because RTGI is clearly expensive, as we can see from Anvil, which cannot do RTGI at 60 fps on base consoles, and even Snowdrop games drop to 720p with RTGI. Having a software-based solution to fall back on is a good thing. Had id Software used non-RT realtime global illumination, they might have been able to focus on providing a bigger jump in level of detail in the levels. Same goes for Indy, which blew its entire rendering budget on RTGI and left some very last gen looking assets in the levels.

Most recent UE5 games built on newer versions are 1440p 30, 1296p 40, and 1080p 60 fps with all the fancy software Lumen, Nanite and virtual shadow map features. Expedition 33, Avowed, and even Wukong managed to hit those targets. Though Wukong was CPU bound in the 60 fps mode because it's on the older UE5.1 version. Its 30 and 40 fps modes are in line with Avatar, AC Shadows, Alan Wake 2, and other major graphics showcases from other engines.
 
The Unreal hate on this board is so silly. People conflate the stuttering issues with performance time and time again. And the shader compilation issues don't even affect consoles, so not sure why console gamers complain about it so much. It's just traversal stutters, and they only happen when crossing invisible loading screens. Sony games use retarded crawling sections to hide them. I'd rather have a spike than a 20-second-long shimmying section, or watch Joel and Ellie hoist each other up ladders for a full minute.

If anything, software Lumen is a good compromise because RTGI is clearly expensive, as we can see from Anvil, which cannot do RTGI at 60 fps on base consoles, and even Snowdrop games drop to 720p with RTGI. Having a software-based solution to fall back on is a good thing. Had id Software used non-RT realtime global illumination, they might have been able to focus on providing a bigger jump in level of detail in the levels. Same goes for Indy, which blew its entire rendering budget on RTGI and left some very last gen looking assets in the levels.

Most recent UE5 games built on newer versions are 1440p 30, 1296p 40, and 1080p 60 fps with all the fancy software Lumen, Nanite and virtual shadow map features. Expedition 33, Avowed, and even Wukong managed to hit those targets. Though Wukong was CPU bound in the 60 fps mode because it's on the older UE5.1 version. Its 30 and 40 fps modes are in line with Avatar, AC Shadows, Alan Wake 2, and other major graphics showcases from other engines.
Some of the most impressive games this gen are Wukong and AC Shadows.
 
Yeah, Forspoken is way too heavy for a game that looks last gen, but that's pretty much every modern Japanese game for you. Monster Hunter Wilds, Dragon's Dogma 2, Elden Ring and Rise of the Ronin all look mid to last gen, and yet perform like Avatar, Wukong and AC Shadows. KojiPro and the RE team at Capcom are the only ones who know what they are doing over there. Although KojiPro might soon be joining the clueless crew if they ship a game with obvious LOD transitions and poor level of detail and lighting. We will find out in 2 weeks.
 

It's such a small thing, but it's great and refreshing to see developers say their ambition is to make the best looking game. So many AAA developers talk about 60 fps and lack of processing power nowadays to try and make excuses for their lack of ambition. While these guys are like: no, we wanted to make the best looking game, sue us.

And yes, it's probably just PR, but I love this because it's a throwback to the PS3 era when everyone was publicly talking about trying to top each other. Not this kumbaya bullshit that seeped into the industry in the PS4 era and this complacent hey-we-will-settle-for-last-gen-or-60-fps nonsense from Sony first party this gen.
 

The difference between their horse rendering and W4's horse rendering is pretty big for being the same engine...

Edit: well, if this trailer taught me something, it's that using MetaHuman tech is not always a sure strike; I guess you also need the talent to make incredible models/faces.
 
OK, I rewatched the AH2 trailer and... yeah, it looks pretty fucking fake, like ILL or High on Life 2, ngl :lollipop_grinning_sweat:
Some parts look feasible, but some are too perfect to be pure gameplay.

Slimer, convince me that what I saw in AH2 was all in-game :messenger_pensive:
 
OK, I rewatched the AH2 trailer and... yeah, it looks pretty fucking fake, like ILL or High on Life 2, ngl :lollipop_grinning_sweat:
Some parts look feasible, but some are too perfect to be pure gameplay.

Slimer, convince me that what I saw in AH2 was all in-game :messenger_pensive:
Well, like 80% of it was cutscenes, so I can't do that.
 
I know this is the wrong thread, but can you guys believe they're charging $80 for Outer Worlds 2? Out of all games, that is what they think justifies $80? Is it using UE5 and has a massive budget increase, or is this just MS getting their greedy feet wet, or just doing their part to try to speed up the transition to $80 as a standard ...
 
I know this is the wrong thread, but can you guys believe they're charging $80 for Outer Worlds 2? Out of all games, that is what they think justifies $80? Is it using UE5 and has a massive budget increase, or is this just MS getting their greedy feet wet, or just doing their part to try to speed up the transition to $80 as a standard ...
It's their only big game this year and they know everyone else is moving to $80, starting with Nintendo. Borderlands is rumored to be $80 as well.

Don't be surprised if Ghosts is also $80.

Every AAA game from MS, Sony, EA, Take-Two, Activision and Ubisoft will be $80 starting next year.
 
OK, I rewatched the AH2 trailer and... yeah, it looks pretty fucking fake, like ILL or High on Life 2, ngl :lollipop_grinning_sweat:
Some parts look feasible, but some are too perfect to be pure gameplay.

Slimer, convince me that what I saw in AH2 was all in-game :messenger_pensive:
Everybody, and I mean everybody, thought that AH1 gameplay trailers were fake, or even that the game was vaporware.

In the end, final product was pretty damn close, if not better.

 
I know this is the wrong thread, but can you guys believe they're charging $80 for Outer Worlds 2? Out of all games, that is what they think justifies $80? Is it using UE5 and has a massive budget increase, or is this just MS getting their greedy feet wet, or just doing their part to try to speed up the transition to $80 as a standard ...
They want to push people towards Game Pass, which will be more tempting if full game ownership is more expensive; it's as simple as that. And they are willing to lose some sales in the process.
 
Everybody, and I mean everybody, thought that AH1 gameplay trailers were fake, or even that the game was vaporware.

In the end, final product was pretty damn close, if not better.


I know dude, I said the same thing to another dude in this very topic; they just look so good :lollipop_grinning_sweat:
 
Everybody, and I mean everybody, thought that AH1 gameplay trailers were fake, or even that the game was vaporware.

In the end, final product was pretty damn close, if not better.


Listen, I'm two minutes in, and it's clear there were MASSIVE downgrades between the demo and launch. Doesn't bode well at all for the latest demo they showed, even if the end result was decent the first time around.

Looks like a completely different game! It's clear that the early trailer WAS fake.
 
Everybody, and I mean everybody, thought that AH1 gameplay trailers were fake, or even that the game was vaporware.

In the end, final product was pretty damn close, if not better.


You mean the guys who lied about ray tracing being in their game? Lol. They did release the game, I'll give them that, but heavily marketing ray tracing together with Nvidia only to have it completely missing from the final game was a total scam.
 
I am ok with getting these fake trailers again. We went a good 5+ years not even getting bullshit trailers at E3.

If Atomic Heart 2 and Witcher 4 do get downgraded then fine. At least we know those developers tried to push the bar instead of some other devs who literally sit at home all day making excuses.

If this shit can run at 800-1080p at 60 fps on a base PS5, then in 2+ years I'm sure it will come very close to this on a PS6 or a high-end PC.

 
Damn, her hair looks lovely. How's the hair in movement, does it look good/realistic?
Yes, it's some of the best hair I've ever seen in a game.
It moves in the wind and with movement all the time, but it's hard to capture in photo mode because it pauses the wind and her hair calms down :P
Some more shots of hair. Some NPCs also use it.
 
The trailer is unreal, which is why I have zero belief the game will look anywhere close to this on release. Atomic Heart was released in early 2023. Are we really to believe the same studio can produce a game of that quality in 2.5 years? Not to mention they are working on a completely new IP as well, which they also had a trailer for
Not saying it's not possible, but...

MGS 1 (1998)



MGS 2 (2001)



Again, not saying it's not possible, nor am I saying they are Kojima's team. However, there are many more examples of a developer having an upgrade moment like this within a short time span, sometimes even within the same generation, like Uncharted 1 and 2.

I don't see why it can't happen in today's world.
 
I'd be disappointed if TW4 doesn't look better. It's doing a lot of nice stuff, but in 800p Vaseline Mode this is the bare minimum I expected coming into this gen.

That ballpark of visuals at 1440p30 was what I was gunning for. So if they can do it at 60 fps with the resolution sitting comfortably at the upper end of that range (1080p), then doing the same at 30 fps should get you into the 1440p region.
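Quick sanity check on that, under the same simplifying assumption that cost scales roughly with pixels pushed per second:

```python
# If the GPU sustains 1080p at 60 fps, the identical per-second budget at
# 30 fps buys roughly twice the pixels per frame (assuming near-linear scaling).
pixels_per_second = 1920 * 1080 * 60      # ~124.4M pixels/s at 1080p60
pixels_per_frame_30 = pixels_per_second / 30

height_30 = (pixels_per_frame_30 * 9 / 16) ** 0.5   # keep 16:9
print(round(height_30))                   # ~1527 -> past 1440p, so the 1440p30 ballpark holds
```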

As far as I'm concerned, sub-1080p is kind of unacceptable at this point unless you're on Series S. It just looks awful on a reasonably-sized 4K TV no matter the content or core visual quality; and in the case of the base PS5, you'll only be getting temporal (or temporal+spatial) upsampling, and even going sub-1440p looks like crap with that.

The tech demo looks great in gifs and is OK on more forgiving smaller displays, but once it's up on the big screen it's a strain to look at. Plus the DoF is also kinda awful, and a lot of the assets in the market take on a kinda "stodgy" look.
 