
Graphical Fidelity I Expect This Gen

…good-feeling, good-controlling 30fps games… for the love of god don't disable motion blur. It helps so much. You'd need something like 240fps before disabling it makes sense; below that it does nothing but smooth animations and reduce panning judder.
30fps feels and controls so well that you need a heavy vaseline smear on your screen to hide just how good it feels in motion.....
These low FPS numbers were never "fine", they were a compromise, and one that started feeling far worse with ultra-low-latency displays like OLEDs.
 

SlimySnake

Flashless at the Golden Globes
On PC the game was one of the UE5 games with the least stutter, I can confirm.

But tbf, I didn't have a terrible experience with SH2R like you did.

I had worse stutter in Callisto on PS5 than SH2R on PC...
Traversal stutter can be mitigated by having a good CPU. My CPU was great up until last year.

With the 5080 rumored to be $1,350, I think I might just do a CPU upgrade this year and ride it out with my 3080 for another 2 years. It was able to run path tracing in Black Myth just fine until the enemies popped up and started causing frametime spikes. Defeating all the enemies in an area made it smooth again.

What sucks is that I will have to upgrade my mobo and then my AIO as well, since I lost the AMD backplate attachment for it. All in all, I'm looking at a $700 upgrade just for the CPU if I go with the 7800X3D.
 
Last edited:

rofif

Can’t Git Gud
Traversal stutter can be mitigated by having a good CPU. My CPU was great up until last year.

With the 5080 rumored to be $1,350, I think I might just do a CPU upgrade this year and ride it out with my 3080 for another 2 years. It was able to run path tracing in Black Myth just fine until the enemies popped up and started causing frametime spikes. Defeating all the enemies in an area made it smooth again.

What sucks is that I will have to upgrade my mobo and then my AIO as well, since I lost the AMD backplate attachment for it. All in all, I'm looking at a $700 upgrade just for the CPU if I go with the 7800X3D.
$700? No way. More. Mobos are stupidly expensive now.
 

SlimySnake

Flashless at the Golden Globes
30fps feels and controls so well that you need a heavy vaseline smear on your screen to hide just how good it feels in motion.....
These low FPS numbers were never "fine", they were a compromise, and one that started feeling far worse with ultra-low-latency displays like OLEDs.
I don't know how true this is. I played Black Myth and Star Wars Outlaws at 30fps recently with motion blur disabled and didn't need it. HFW a couple of years ago as well.

Maybe I've just built up a tolerance for 30fps over the years.
This is absolutely correct.
30fps was just fine and allowed devs to push graphics: Order 1886, Driveclub, all the Uncharted and Gears games, TLOU2, Death Stranding, all the Souls games.
Nobody minded 30 a moment ago. Now it's suddenly unplayable?! Like wtf.
Is it because these idiots played a 60fps game once and now think they're too good for 30? I don't get it. Maybe it's because they disable motion blur, because it's hip and cool to do so.
I've played games my whole life. I'm 35 and I've played at everything from 15 to 240fps and had a 120Hz CRT and everything. Frame rate is not the be-all and end-all.
If it's done well, with good deadzones and controls, no reliance on system vsync etc., and with good motion blur, the results can be great.
Take The Order: 1886, which I just finished. That game at 30fps is more responsive and controls better than Alan Wake 2 at 60 on PS5, and I am not joking. Most 30fps modes this gen are awful and laggy.
If you launch the original Uncharted 4, Bloodborne, or The Order: 1886, you can see how good 30fps can be.

The only good 30fps mode this gen was FF16. Demon's Souls' 30fps mode was awful.

Devs started listening to these nouveau gaming experts who think that after finishing three games at 60fps they can make decisions and demands, while at the same time somehow claiming that $700 is a lot for a PS5 Pro when it's just a mid-range GPU price… insane.
I wish we had devs with a solid vision for their game. Modes make no sense to me. Just make the game however you think is best, offer a VRR unlock so future consoles are covered, and that's it.

The other thing is that devs rely too much on built-in UE5 features and don't code to the metal like they used to, even if UE5 brings good results.
And when an original game on an in-house engine comes along, like Forspoken, it gets ignored anyway, and it had a lot of cool graphical features to talk about.
I've always felt Sony games felt great at 30fps. Not every developer knows how to get 30fps right, but the ones that do make them feel very smooth. Ratchet, HFW (after they fixed the HDR flickering issue) and especially Spider-Man 2 felt very smooth. Demon's Souls didn't.

I think input lag is often confused with 30fps too. RDR2 has really bad input lag that makes the game feel like shit even at 60fps.
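To make that distinction concrete, here's a rough latency model (the frames-in-flight numbers are made up for illustration): what you feel is how many frames sit between your input and the display, times the frametime, not the fps number itself.

```python
# Toy latency model: end-to-end lag ~= frames queued in the pipeline times
# frametime, plus display processing. All numbers here are illustrative.
def latency_ms(fps: float, frames_in_flight: int, display_ms: float = 10.0) -> float:
    return frames_in_flight * 1000.0 / fps + display_ms

print(f"tight 30fps pipeline (2 frames queued): {latency_ms(30, 2):.0f} ms")
print(f"laggy 60fps pipeline (6 frames queued): {latency_ms(60, 6):.0f} ms")
# A well-paced 30fps game really can out-respond a buffer-heavy 60fps one.
```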

$700? No way. More. Mobos are stupidly expensive now.
I'm not spending more than $600 on the CPU+mobo combo. I've seen it go on sale at Micro Center so I'll just wait for another sale. But yes, buying them separately would be well over $700, especially since I need WiFi and Bluetooth built in.
 

GymWolf

Member
Traversal stutter can be mitigated by having a good CPU. My CPU was great up until last year.

With the 5080 rumored to be $1,350, I think I might just do a CPU upgrade this year and ride it out with my 3080 for another 2 years. It was able to run path tracing in Black Myth just fine until the enemies popped up and started causing frametime spikes. Defeating all the enemies in an area made it smooth again.

What sucks is that I will have to upgrade my mobo and then my AIO as well, since I lost the AMD backplate attachment for it. All in all, I'm looking at a $700 upgrade just for the CPU if I go with the 7800X3D.
Was the ray tracing in Wukong really path tracing, like the exact same tech as CP and Indy? I thought it was just generic RTGI or some shit.
 

SlimySnake

Flashless at the Golden Globes
Was the ray tracing in Wukong really path tracing, like the exact same tech as CP and Indy? I thought it was just generic RTGI or some shit.
yep.

Full ray tracing is a demanding but highly accurate way to render light and its effect on a scene. Also known as Path Tracing, this advanced ray tracing technique is used by visual effects artists to create film and TV graphics that are indistinguishable from reality, but until the arrival of GeForce RTX GPUs with RT Cores, and the AI-powered acceleration of NVIDIA DLSS, full ray tracing in real-time video games was impossible.
In Black Myth: Wukong, full ray tracing increases the fidelity and quality of lighting, reflections and shadows. Reflections on water reflect all surrounding detail. Water caustics add further realism, accurately rendering the refraction and reflection of light. Particles are reflected, making battles more dynamic and exciting. Fully ray-traced lighting ensures lighting indoors and outdoors is pixel perfect, darkening areas where light is occluded or doesn’t reach, and realistically illuminating the world by bouncing light. And in concert with the lighting system, contact hardening and softening fully ray-traced shadows are cast everywhere, rendering the smallest of shadows from leaves and pebbles, and those from geometry-rich buildings, the main character, and the gigantic bosses that must be overcome.

Full Resolution Multi-Bounce Ray-Traced Indirect Lighting

With multi-bounce ray-traced indirect lighting, natural colored lighting bounces up to two times throughout the world of Black Myth: Wukong, creating more realistic indirect lighting and occlusion. Rendering techniques including Screen Space Reflections, Screen Space Ambient Occlusion, and the existing GI solutions were replaced by a single, unified algorithm that delivers more accurate lighting of scenes and objects, making each scene as immersive as possible.
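As a sanity check on why a two-bounce cap is considered enough, here's a toy geometric-series estimate (my own back-of-envelope, not NVIDIA's or Game Science's math): each extra diffuse bounce contributes roughly albedo^n of the direct light.

```python
# Toy estimate: fraction of full multi-bounce GI energy captured by an
# n-bounce cap, for a diffuse surface of the given albedo (a furnace-style
# approximation, my own sketch, not how Wukong actually computes GI).
def captured(albedo: float, bounces: int) -> float:
    return sum(albedo ** n for n in range(bounces + 1))

for albedo in (0.3, 0.5, 0.7):
    full = 1.0 / (1.0 - albedo)  # infinite-bounce limit of the series
    print(f"albedo {albedo}: 2 bounces = {captured(albedo, 2) / full:.0%} of full GI")
# Low-to-mid albedos land near 90%+, which is why a two-bounce cap usually
# looks close to converged while costing far less.
```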

 

GymWolf

Member
yep.



I tried multiple times with it on and off and barely noticed the difference. Maybe I was just unlucky and tried it in parts of the game where the difference was minimal...
I don't even remember it being as heavy as PT in Cyberpunk.

At least in Cyberpunk I can immediately tell that everything is darker and different...
 
Last edited:
I don't know how true this is. I played Black Myth and Star Wars Outlaws at 30fps recently with motion blur disabled and didn't need it. HFW a couple of years ago as well.

Maybe I've just built up a tolerance for 30fps over the years.
Considering how this is always a point of discussion, this may very well be a question of comparison points and adaptation.
As someone who mainly plays on PC at a minimum of 60fps, I find playing at 30fps like wading through molasses. And since I switched to OLED at the beginning of this gen it got much, much worse: panning now feels like the game is heavily lagging due to the elimination of LCD smear. Or rather, I now have the choice between very choppy camera movement or smeary motion blur, both of which I absolutely loathe.
Maybe I wouldn't have this issue if I only played at 30, but switching back and forth is an absolute no-go.
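The choppiness is easy to put numbers on: on a sample-and-hold panel each frame is one discrete step of the pan, and without LCD smear or motion blur you see the full step. (The pan speed below is a made-up example.)

```python
# Per-frame step size of a horizontal pan on a sample-and-hold display.
# The pan speed is a hypothetical example, not a measured value.
pan_speed_px_per_s = 1920  # e.g. the camera crosses a 1080p-wide screen in 1s
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps: {pan_speed_px_per_s / fps:>4.0f} px jump per frame")
# 64 px jumps at 30fps are what motion blur (or LCD smear) papers over;
# by 240fps the steps are small enough that blur stops mattering much.
```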
 
Last edited:

hinch7

Member
Absolute bullshit. Go play some good 30fps games like Bloodborne, the original UC4, or The Order: 1886 and tell me it feels or controls like a slideshow… there are a ton of good-feeling, good-controlling 30fps games. Just not this gen.
Just get a PC if you are whining about 30fps on consoles. You created this problem.
40fps modes are not a good thing. They are a solution to a problem that shouldn't exist. Bloodborne is more responsive at 30 than Wukong at 40, and it looks smoother too, despite some frame pacing issues.
And for the love of god don't disable motion blur. It helps so much. You'd need something like 240fps before disabling it makes sense; below that it does nothing but smooth animations and reduce panning judder.

I would personally prefer a 4K patch rather than a 60fps patch for Bloodborne, if it had to be one or the other.
I played all of those and 30fps feels like slop compared to 60fps. I replayed UC4 with the patch on PS5 in performance mode and it's a night and day difference. On a TV there's also added input lag, even with game mode. Not to mention input lag from the console and controller (DualSense and DS4 poll at 300Hz iirc), which adds several ms of latency. DF did some testing and the average end-to-end input lag is around 100ms on console at 60fps. At 30 that's going to be even higher.

In frametime terms, 40fps latency sits exactly between 30 and 60, and the requirements to reach that target are much lower than for 60. I play all my games locked at 40fps/40Hz on the Steam Deck and it's a great experience. Faster display + inputs and way less power needed; thus longer battery life.
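The "exactly in between" part is true in frametime, which is what latency actually follows, even though 40 looks closer to 30 on an fps scale:

```python
# Frametime, not fps, is the linear scale for latency and frame delivery.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 33.3 ms (30fps) and 16.7 ms (60fps) have their midpoint at 25.0 ms: 40fps.
```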
 
Last edited:

rofif

Can’t Git Gud
I played all of those and 30fps feels like slop compared to 60fps. I replayed UC4 with the patch on PS5 in performance mode and it's a night and day difference. On a TV there's also added input lag, even with game mode. Not to mention input lag from the console and controller (DualSense and DS4 poll at 300Hz iirc), which adds several ms of latency. DF did some testing and the average end-to-end input lag is around 100ms on console at 60fps. At 30 that's going to be even higher.

In frametime terms, 40fps latency sits exactly between 30 and 60, and the requirements to reach that target are much lower than for 60. I play all my games locked at 40fps/40Hz on the Steam Deck and it's a great experience. Faster display + inputs and way less power needed; thus longer battery life.
Nobody here argues that 40/60/120 isn't better than 30. Of course it is.
We are just arguing that 30 is worth it for the graphics push... if that 30 is done correctly.
Of course in a direct A/B comparison 30fps feels bad. But in a vacuum, if you just play 30fps games? It's fine.
I am playing Wukong at 40fps now and The Order: 1886 controls more responsively... But otherwise, 40fps games are fine. I like that mode.
 
Last edited:

GymWolf

Member
I tried to replay Bloodborne recently and somehow the 30fps there feels worse than something like Spider-Man, and the game doesn't even have a locked 30...
 
Last edited:

diffusionx

Gold Member
I played all of those and 30fps feels like slop compared to 60fps. I replayed UC4 with the patch on PS5 in performance mode and it's a night and day difference. On a TV there's also added input lag, even with game mode. Not to mention input lag from the console and controller (DualSense and DS4 poll at 300Hz iirc), which adds several ms of latency. DF did some testing and the average end-to-end input lag is around 100ms on console at 60fps. At 30 that's going to be even higher.

In frametime terms, 40fps latency sits exactly between 30 and 60, and the requirements to reach that target are much lower than for 60. I play all my games locked at 40fps/40Hz on the Steam Deck and it's a great experience. Faster display + inputs and way less power needed; thus longer battery life.
Obviously 40fps or 60fps is objectively superior to 30fps. But we played 30fps games almost exclusively for something like 15 years, and somehow we didn't die and loved a lot of those games. Well-made 30fps with consistent frametimes and smart use of motion blur can look great and play well.

And even when it comes to "going back", it's really not that big of a deal. I played Sekiro, which is a 60fps game, and then went to Bloodborne, which isn't even a great 30fps, and you just get used to it pretty quickly. I'd prefer devs target higher frame rates these days, and they do.
 
Last edited:

hinch7

Member
Nobody here argues that 40/60/120 isn't better than 30. Of course it is.
We are just arguing that 30 is worth it for the graphics push... if that 30 is done correctly.
Of course in a direct A/B comparison 30fps feels bad. But in a vacuum, if you just play 30fps games? It's fine.
I am playing Wukong at 40fps now and The Order: 1886 controls more responsively... But otherwise, 40fps games are fine. I like that mode.
Given the choice, people will sacrifice some visual fidelity to reach higher framerates for better playability. If we go by Cerny's PS5 Pro presentation, the majority of players they polled preferred performance modes. Gameplay and feedback > some extra fidelity. If there wasn't a need for this, the PS5 Pro wouldn't have launched.

Granted, there are exceptions, like GTA, where Rockstar are pouring several hundred million into development to push the technical frontier of what's possible on closed hardware, and where you can't really expect them to deliver 60fps given how crazy that game's scope is.
 

GymWolf

Member
Given the choice, people will sacrifice some visual fidelity to reach higher framerates for better playability. If we go by Cerny's PS5 Pro presentation, the majority of players they polled preferred performance modes. Gameplay and feedback > some extra fidelity. If there wasn't a need for this, the PS5 Pro wouldn't have launched.

Granted, there are exceptions, like GTA, where Rockstar are pouring several hundred million into development to push the technical frontier of what's possible on closed hardware, and where you can't really expect them to deliver 60fps given how crazy that game's scope is.
I straight up hope GTA6 is a solid 30 with lows into the 20s (maybe a stable 30 on the Pro).

That's how I'd know they are using these consoles to 120%; if the game has a performance mode, it means they left city simulation, graphics, and physics on the table to accommodate a higher framerate.
 

rofif

Can’t Git Gud
Given the choice, people will sacrifice some visual fidelity to reach higher framerates for better playability. If we go by Cerny's PS5 Pro presentation, the majority of players they polled preferred performance modes. Gameplay and feedback > some extra fidelity. If there wasn't a need for this, the PS5 Pro wouldn't have launched.

Granted, there are exceptions, like GTA, where Rockstar are pouring several hundred million into development to push the technical frontier of what's possible on closed hardware, and where you can't really expect them to deliver 60fps given how crazy that game's scope is.
The problem is that right now, yes - it is sacrificing a little bit of visual clarity for 40 or 60fps.
So the tradeoff seems rather small and worth it.

But that's only because these games are designed with 60fps in mind, not 30.
Try to imagine Uncharted 4, TLOU2, or The Order: 1886 running on PS4 at 30fps... and then what would be needed to get them to run at 60.
It's not just a low-res 60fps mode. The whole games would have to be changed a lot. Even the 30fps modes.
 

H . R . 2

Member
I straight up hope GTA6 is a solid 30 with lows into the 20s (maybe a stable 30 on the Pro).

That's how I'd know they are using these consoles to 120%; if the game has a performance mode, it means they left city simulation, graphics, and physics on the table to accommodate a higher framerate.
...and Series S says hello
 

hinch7

Member
I straight up hope GTA6 is a solid 30 with lows into the 20s (maybe a stable 30 on the Pro).

That's how I'd know they are using these consoles to 120%; if the game has a performance mode, it means they left city simulation, graphics, and physics on the table to accommodate a higher framerate.
True, I expect Rockstar to push these consoles to the limit just like they did with GTA V and RDR2. Part of what makes GTA so special is the physics, interactivity in the world, and attention to detail, which make the games so unique. The hair physics alone at that scale, in an open-world environment, already looks mighty impressive. I'd rather they focus on the simulation side tbh.

Getting so hyped for GTA VI lol. It's been too long since I've been wowed by a game. Playing CP2077 with the Overdrive patch was the last one to do that for me. Hopefully this will do the same.
 
Last edited:

H . R . 2

Member
GPU- and CPU-heavy games that are built from the ground up with visual fidelity and a proper amount of simulation in mind almost always end up as 30fps experiences,
and this would have remained the case for another generation or two until the technology caught up [had the industry not decided to move in a different direction, i.e. 60fps].

It is much easier to put off engine enhancements and completely ignore complex simulations in favour of 60fps;
all you need to do is downgrade and slap the fancy label of 'scalability' on top of it.

DF and critics of that ilk, who started attacking games [that actually pushed the envelope]
on the grounds of inconsistent performance at launch, literally ruined the industry.

Companies realised it is far easier to achieve 60fps, avoid criticism, and appease the 'inferior' console
gamers [vs. PCMR] than to throw money at their proprietary engines in the hope of satisfying gamers and their ever-growing expectations.

EA dialed back large-scale destruction after BF3, heavily downgraded it after BF4, and completely ditched it after BF1.
Ubisoft never again innovated the way it did with Unity, and downgraded the AC games so much afterwards
that I thought Syndicate was broken on my PC, it looked so incredibly ugly.

Those who push the 60fps agenda while complaining about the lack of innovation or visual fidelity in games need to create a separate thread called
THE PERFORMANCE I EXPECT THIS GEN,

because the fact is the console tech is not there yet [to achieve both photorealistic visuals/impressive simulation AND 60fps],
and regardless of how much we go on and on about the merits of 60fps [which I am sure no one here denies],
the reality is the games released so far have failed to achieve this, and the ones expected to impress [kind of!]
are not going to be released for another 2-3 years [practically next-/cross-gen territory] and have yet to prove themselves.

So if you are not willing to make sacrifices, then you might as well keep playing the graphics king of the year, Indiana Jones.
 
Last edited:

kevboard

Member
EA dialed back large-scale destruction after BF3, heavily downgraded it after BF 4 and completely ditched it after BF1

BF3 didn't have large-scale destruction. it had canned, precomputed animations that played when certain spots of buildings were damaged.

there was zero physics simulation happening, and therefore there was zero physics to scale back.

physics interactions became less common as CPUs became more powerful. only games like Control actually do it, and that game in particular uses it almost exclusively as cosmetic flair, not a real gameplay element.

the real reason we don't see actual interactivity and meaningful physics and destruction is the modern AAA game design philosophy of everything being simplified and streamlined.

can't have a fully scripted and half automated gameplay "experience" when the player has too much agency. I mean fuck, God of War doesn't even let you jump anymore outside of context sensitive jump spots, which makes designing levels so much simpler if every move the player can make is either simply walking or context sensitive interaction.

this simplified game design and the lack of interest by AAA devs to make anything actually innovative or interesting and dynamic is the reason everything is as static as a PS1 game these days.
 
Last edited:

H . R . 2

Member
BF3 didn't have large-scale destruction. it had canned, precomputed animations that played when certain spots of buildings were damaged.

I am aware; I am not talking about Levolution.
there was zero physics simulation happening, and therefore there was zero physics to scale back.

physics interactions became less common as CPUs became more powerful. only games like Control actually do it, and that game in particular uses it almost exclusively as cosmetic flair, not a real gameplay element.

still, I do not consider this a logical or fair comparison: the destruction in BF:BC2, a multiplayer game from two gens ago [however precomputed] on the 360, was far more impressive than Control's surface-level destruction.

this simplified game design and the lack of interest by AAA devs to make anything actually innovative or interesting and dynamic is the reason everything is as static as a PS1 game these days.
I'd like to hear your explanation as to why the industry ended up like this.
I have said many times that I don't consider 60fps the only culprit, but it is nonetheless one of the main reasons why the industry has stopped pushing for innovation and fidelity.
In addition, I for one consider the shortage of talent, the stagnation [especially post-COVID], and the over-commercialisation of the gaming industry, which caused a surge in the number of casual gamers, to be the other main reasons.
 
Last edited:

kevboard

Member
I am aware; I am not talking about Levolution.

BF3 didn't have much else tho... especially compared to modern BF.

still, I do not consider this a logical or fair comparison: the destruction in BF:BC2, a multiplayer game from two gens ago [however precomputed] on the 360, was far more impressive than Control's surface-level destruction.

well, BF:BC2's destruction ran perfectly fine at 60fps on PC CPUs less powerful than what the Xbox One and PS4 had


I'd like to hear your explanation as to why the industry ended up like this.
I have said many times that I don't consider 60fps the only culprit, but it is nonetheless one of the main reasons why the industry has stopped pushing for innovation and fidelity.
In addition, I for one consider the shortage of talent, the stagnation [especially post-COVID], and the over-commercialisation of the gaming industry, which caused a surge in the number of casual gamers, to be the other main reasons.

in my opinion the reason is the "cinematic" focus of modern AAA games.
in order to be "cinematic" a game needs to almost control the player. imagine if you destroyed a piece of environment that the developers wanted to use in a "cinematic" sequence a few moments later.

this approach to game design also MASSIVELY simplifies development. and with rising development costs due to the constant push for more realistic graphics and movie-like cutscenes, saving cost in other ways makes sense... it's far easier to sell a game to the general casual audience through flashy graphics and cutscenes than it is with innovative gameplay.

it's no surprise to me that Nintendo has had by far the most interesting use of real-time physics of the last decade, and has done so on a mobile chip from 2016.
Nintendo is a game-design-first company. they even go so far as to sometimes come up with a game design idea first, before attaching it to an IP.
Sony, Microsoft, EA... they are the extreme opposite of that. game design often seems to come last.
 
Last edited:

H . R . 2

Member
well, BF:BC2's destruction ran perfectly fine at 60fps on PC CPUs less powerful than what the Xbox One and PS4 had

I could argue that EA insisted on a higher player count ON TOP OF 60fps from BF4 onwards, causing the series to lose its trademark destruction.
it's far easier to sell a game to the general casual audience through flashy graphics and cutscenes than it is with innovative gameplay.

we are not even getting that.
I can't think of a non-linear game in recent memory that pushed visual fidelity and innovative gameplay in tandem, or even one that just pushed graphics on a larger scale than HB2.

so do people still think they can have 60fps and innovation and visual fidelity all at the same time this gen?
 

kevboard

Member
so do people still think they can have 60fps and innovation and visual fidelity all at the same time this gen?

well, the issue is that visual fidelity is being pushed, but in a way that tanks performance while not looking good enough to justify that bad performance.

Nanite tanks performance, Lumen tanks performance, bad optimisation due to bloated engines tanks performance.
just layers of shit tech that is in theory "cutting edge" but in practice not worth it.

the truth is that there simply is no motivation on the side of AAA devs to push for physics- and systemic-interaction-driven game design.
it's harder to design, harder to develop, and especially harder to do competently.

the Switch's hardware can run Zelda at 60fps btw.
it's the most physics-driven game in recent history, and if you circumvent Nintendo's downclocks the game reaches 60fps.
this is, again, hardware meant for tablets in 2016. there simply is no hardware-limit excuse for any game that runs on Zen 2 and RDNA 2 based consoles with nearly 20 times the GPU grunt and multiple times the CPU power of that tablet hardware.
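For what it's worth, the "nearly 20 times" figure checks out against public FP32 numbers (a crude peak-TFLOPS comparison only, which ignores architecture differences):

```python
# Peak FP32 throughput, public figures; a rough proxy for "GPU grunt".
switch_docked_tflops = 0.393   # Tegra X1 GPU at docked clocks
ps5_tflops, series_x_tflops = 10.28, 12.15
print(f"PS5 vs Switch:      {ps5_tflops / switch_docked_tflops:.0f}x")
print(f"Series X vs Switch: {series_x_tflops / switch_docked_tflops:.0f}x")
# ~26x and ~31x, so if anything the "nearly 20 times" claim is conservative.
```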
 

H . R . 2

Member
well, the issue is that visual fidelity is being pushed, but in a way that tanks performance while not looking good enough to justify that bad performance.

Nanite tanks performance, Lumen tanks performance, bad optimisation due to bloated engines tanks performance.
just layers of shit tech that is in theory "cutting edge" but in practice not worth it.
most UE5.1 titles were admittedly taxing even on the latest hardware, but the majority of those games were perfectly playable.
if it were simply 'not worth it' we wouldn't have seen most developers switch to UE5.
also, UE5 made next-gen possible, if for now at a cost; there are not many engines out there capable of that.
also, we have yet to see any UE5.5 games, as most of the issues you mentioned above have reportedly been addressed.
 

kevboard

Member
most UE5.1 titles were admittedly taxing even on the latest hardware, but the majority of those games were perfectly playable.
if it were simply 'not worth it' we wouldn't have seen most developers switch to UE5.
also, UE5 made next-gen possible, if for now at a cost; there are not many engines out there capable of that.
also, we have yet to see any UE5.5 games, as most of the issues you mentioned above have reportedly been addressed.

well, I don't know a single UE5 game that looks on par with the best of last gen. not at 60 nor at 30fps.
 

SlimySnake

Flashless at the Golden Globes
GPU- and CPU-heavy games that are built from the ground up with visual fidelity and a proper amount of simulation in mind almost always end up as 30fps experiences,
and this would have remained the case for another generation or two until the technology caught up [had the industry not decided to move in a different direction, i.e. 60fps].

It is much easier to put off engine enhancements and completely ignore complex simulations in favour of 60fps;
all you need to do is downgrade and slap the fancy label of 'scalability' on top of it.

DF and critics of that ilk, who started attacking games [that actually pushed the envelope]
on the grounds of inconsistent performance at launch, literally ruined the industry.

Companies realised it is far easier to achieve 60fps, avoid criticism, and appease the 'inferior' console
gamers [vs. PCMR] than to throw money at their proprietary engines in the hope of satisfying gamers and their ever-growing expectations.

EA dialed back large-scale destruction after BF3, heavily downgraded it after BF4, and completely ditched it after BF1.
Ubisoft never again innovated the way it did with Unity, and downgraded the AC games so much afterwards
that I thought Syndicate was broken on my PC, it looked so incredibly ugly.

Those who push the 60fps agenda while complaining about the lack of innovation or visual fidelity in games need to create a separate thread called
THE PERFORMANCE I EXPECT THIS GEN,

because the fact is the console tech is not there yet [to achieve both photorealistic visuals/impressive simulation AND 60fps],
and regardless of how much we go on and on about the merits of 60fps [which I am sure no one here denies],
the reality is the games released so far have failed to achieve this, and the ones expected to impress [kind of!]
are not going to be released for another 2-3 years [practically next-/cross-gen territory] and have yet to prove themselves.

So if you are not willing to make sacrifices, then you might as well keep playing the graphics king of the year, Indiana Jones.
I thought Indy would've been the game that made people realize the cost of 60fps on consoles. A game that barely looks better than last gen, and yet targets 1800p 60fps. Given graphics of the year by a tech channel.....

I was with DF when they would point out games performing in the low 20s. But most games last gen were locked at 30fps, especially the big ones like TLOU2 and RDR2. Locked 30fps became a thing once devs implemented DRS systems. But DF, chasing clickbait headlines, moved on to the next big thing: 60fps performance. Like, use your brain. It's struggling because it's pushing the visuals, unlike other games. Who gives a shit if cross-gen games ran at 60fps. You should be the ones educating people on why games are struggling to run at high resolutions and 60fps. Your videos should point out just why it's so goddamn expensive.

But nope, they NEVER compare games to other games in their reviews. I'd like some side-by-side comparisons of next-gen games and last-gen games. Everyone seems to think TLOU2 and RDR2 are still the best-looking games, but one comparison to Hellblade 2 and Black Myth would easily show why it's so goddamn expensive. Maybe I will do it.
 

SlimySnake

Flashless at the Golden Globes
Fortnite, Silent Hill 2, Wukong... pretty sure most of them.

and the best one here is Fortnite 😭
every other UE5 game is low resolution, full of artifacts, shitty Lumen boiling, light leaking through shit, lighting adjusting super slowly to changes... it's horrific to look at, all of it.
Console? Or PC?
 

kevboard

Member
Console? Or PC?

mostly console, but SH2 on PC only, and Fortnite on... well, name the platform and I've played it on it.

SH2 on PC needed me to manually adjust the configuration file and install a mod to be presentable and run better at the same time.

it is still plagued by traversal stutter no matter what, of course.
 

setoman

Member
lol, to me the Horizon machines look overdesigned as f*ck

It's impressive in terms of quantity of detail, but it's too busy for my taste.

Intergalactic's robot looks fine. In terms of geometry detail, you don't notice cutbacks in the rounded parts.

It looks exactly like what the designers wanted to make for that world. Maybe something busier wouldn't fit the aesthetic.
Nah, there are definitely other directions they could have taken to make the game actually look next gen.
For example, below.

[image]
 

SlimySnake

Flashless at the Golden Globes
mostly console, but SH2 on PC only, and Fortnite on... well, name the platform and I've played it on it.

SH2 on PC needed me to manually adjust the configuration file and install a mod to be presentable and run better at the same time.

it is still plagued by traversal stutter no matter what, of course.
Then honestly I don't know what you could be missing. When I played HB2, Black Myth, and SH2, I saw a clear generational leap, mostly thanks to the stunning photorealistic lighting that's missing from top-tier games from last gen and even this gen. Made possible by Lumen and Nanite.

In fact, I'm playing A Plague Tale right now and my god it's gorgeous. Really high-quality assets too. But there is something missing. It's hard to pin down, but it's the photorealistic look of UE5 games that's lacking here. It looks gamey, and while the assets are really high quality, it's missing that edge UE5's Nanite has, which makes it look cross-gen despite being next-gen only and really pushing the consoles hard.

SH2's stuttering is awful on PC, but that's on Bloober. Black Myth and Hellblade 2 have no such stutters. Future UE5 games won't have it as long as they build their games on UE5.4.
 

setoman

Member
Who knows what the future holds, but I was very disappointed when UE5 games simply ignored the Chaos physics Epic had demoed in the Matrix demo. The engine has supported this ever since UE4. Devs just don't want this shit in their games.

Notice how the cars that get nicked by my car have accurate physics applied to them even after the initial collision. They move and get stopped by the poles behind them. Some even have their windows and headlights break.
And this was before they integrated Nanite into their collision engine.
I wouldn't blame developers when it comes to UE5 Chaos physics. Chaos performance is terrible.
Nanite character models are needed BADLY. Every time I see a character model in current-gen games, I'm reminded of PS4-era games, even some parts of Hellblade 2 when the character models drop to lower LODs during gameplay. The last-gen hair (Dragon Age: The Veilguard has next-gen hair) and low-poly models have to go. I also need the density and volumetrics seen in trailers like the Witcher 4 cinematic. The density and visual features in these CGI trailers need to translate to real time ASAP. Only Hellblade 2 dropped my jaw, as it felt like playing CGI with CGI-quality assets; I got the prerendered-trailer feeling from Hellblade 2. We need more games to look like Marvel 1943 fidelity-wise…

It's not really the characters that are the problem, it's the shading. Skin shading and the entire character rendering pipeline are expensive. So you will still run into the same bottleneck.

So while, yes, you will be able to push the triangle counts of characters, it will still be equally expensive to run the skin shading, eye shading, hair shading, etc. on all those characters. So it's not like you will have a hundred of these below. Sure, they will have the same LOD, but they won't all look like this.

[images: high-detail character renders]
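A back-of-envelope version of that argument (every per-character cost below is invented for illustration; real budgets vary wildly):

```python
# Sketch: Nanite scales geometry, but per-character shading cost still
# multiplies with the number of hero-quality characters on screen.
# All millisecond figures are hypothetical.
skin_ms, hair_ms, eyes_ms = 0.8, 1.2, 0.3   # per character, per frame
frame_budget_ms = 33.3                      # one 30fps frame
per_character = skin_ms + hair_ms + eyes_ms
print(f"one hero character: {per_character:.1f} ms of shading")
print(f"characters before shading alone eats the frame: {int(frame_budget_ms // per_character)}")
# Same-LOD geometry for a crowd is feasible; hero-grade shading for all isn't.
```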
 

SlimySnake

Flashless at the Golden Globes
Nah, there are definitely other directions they could have taken to make the game actually look next gen.
For example, below.

[image]
Yep, it's too basic. Really uninspired robot design, and the lack of detail on the robot doesn't help either.

But they are going for that retro Star Wars aesthetic, which is designed to look rather basic.
 

kevboard

Member
Then honestly I don't know what you could be missing. When I played HB2, Black Myth, and SH2, I saw a clear generational leap, mostly thanks to the stunning photorealistic lighting that's missing from top-tier games from last gen and even this gen. Made possible by Lumen and Nanite.

In fact, I'm playing A Plague Tale right now and my god it's gorgeous. Really high-quality assets too. But there is something missing. It's hard to pin down, but it's the photorealistic look of UE5 games that's lacking here. It looks gamey, and while the assets are really high quality, it's missing that edge UE5's Nanite has, which makes it look cross-gen despite being next-gen only and really pushing the consoles hard.

SH2's stuttering is awful on PC, but that's on Bloober. Black Myth and Hellblade 2 have no such stutters. Future UE5 games won't have it as long as they build their games on UE5.4.

I value temporal stability over basically anything else, and when it comes to temporal stability there has been a clear step back from the best last-gen games.

I honestly couldn't care less about realistic lighting; a pleasing final image is much more important to me.
Lumen especially is just DOGSHIT, like legit worse than Xbox 360 era lighting. what use is "realism" if this realism comes with a massive side dish of RT boiling, constant ghosting, constant light leakage, and very noticeable and slow accumulation?

I'd take prebaked lighting a la Mirror's Edge over literally any Lumen implementation on the market so far.
[Mirror's Edge screenshots]


and the prebaked GI and shadows in Mirror's Edge were massively limited by the memory sizes at the time. imagine this game targeting a console with 16GB of memory... the quality of the prebaked GI and shadows would destroy Lumen even more
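For what it's worth, the "slow accumulation" complaint has a simple mechanical explanation: real-time GI generally blends each frame's noisy result into a history buffer. A toy exponential-moving-average model (my assumption about the general technique, not Lumen's actual code) shows the lag:

```python
# How many frames a history-buffer blend takes to reach 90% of a sudden
# lighting change, for a few blend weights (all weights hypothetical).
def frames_to_settle(blend: float, threshold: float = 0.9) -> int:
    value, frames = 0.0, 0
    while value < threshold:
        value += blend * (1.0 - value)   # EMA step toward the new lighting
        frames += 1
    return frames

for blend in (0.05, 0.10, 0.20):
    n = frames_to_settle(blend)
    print(f"blend {blend:.2f}: ~{n} frames (~{n / 30:.1f} s at 30fps)")
# Small blend weights mean stable but laggy lighting; that's the tradeoff.
```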
 
Last edited:

SimTourist

Member
The problem is that right now, yes - it is sacrificing a little bit of visual clarity for 40 or 60fps.
So the tradeoff seems rather small and worth it.

But that's only because these games are designed with 60fps in mind, not 30.
Try to imagine Uncharted 4, TLOU2, or The Order: 1886 running on PS4 at 30fps... and then what would be needed to get them to run at 60.
It's not just a low-res 60fps mode. The whole games would have to be changed a lot. Even the 30fps modes.
You can check out what an Uncharted 4 60fps mode would look like by playing the multiplayer, which runs at 900p/60fps and still has much simpler graphics than the SP mode.
 
Last edited:

GymWolf

Member
But nope, they NEVER compare games to other games in their reviews. I'd like some side-by-side comparisons of next-gen games and last-gen games. Everyone seems to think TLOU2 and RDR2 are still the best-looking games, but one comparison to Hellblade 2 and Black Myth would easily show why it's so goddamn expensive. Maybe I will do it.
Can't wait to see your Digital Slimery channel
 
Last edited:

H . R . 2

Member
Here is a report by the NY Times from a few days ago which, while only partially true, explains why the entire industry seems to have made a unanimous decision that visual fidelity should no longer be the priority: fewer and fewer gamers appreciate the technical aspects/visuals of games, and more and more are interested in online gaming, which tends to prioritise the sense of community over visuals.

Joost van Dreunen, a market analyst and professor at New York University, said
it was clear what younger generations value in their video games: “Playing is an excuse for hanging out with other people.”

When millions are happy to play old games with outdated graphics — including Roblox (2006), Minecraft (2009) and Fortnite (2017) —
it creates challenges for studios that make blockbuster single-player titles.

The industry’s audience has slightly shrunk for the first time in decades.
Studios are rapidly closing and sweeping layoffs have affected more than 20,000 employees in the past two years, including more than 2,500 Microsoft workers.
Many video game developers built their careers during an era that glorified graphical fidelity.

They marveled at a scene from The Last of Us: Part II in which Ellie, the protagonist,
removes a shirt over her head to reveal bruises and scrapes on her back without any technical glitches.
But a few years later, costly graphical upgrades are often barely noticeable.

Optimizing cinematic games for a narrow group of consumers who have spent hundreds of dollars on a console or computer may no longer make financial sense.
Studios are increasingly prioritizing games with basic graphics that can be played on the smartphones already in everyone’s pocket.

“They essentially run on toasters,” said Matthew Ball, an entrepreneur and video game analyst, talking about games like Roblox and League of Legends.
“The developers aren’t chasing graphics but the social connections that players have built over time.”

Developers had long taught players to equate realism with excellence, but this new toaster generation of gamers is upsetting industry orthodoxies.

Rami Ismail, a game developer in the Netherlands, recalled a question that emerged early in the coronavirus pandemic
and has become something of an unofficial motto in the video game industry.

“How can we as an industry make shorter games with worse graphics made with people who are paid well to work less?” Ismail said.

“If we can, then there might be short-term hope,” he continued. “Otherwise I think the slow strangulation of the games industry is ongoing.”

full report:
 
Last edited:
Here is a report by the NY Times from a few days ago which, while only partially true, explains why the entire industry seems to have made a unanimous decision that visual fidelity should no longer be the priority: fewer and fewer gamers appreciate the technical aspects/visuals of games, and more and more are interested in online gaming, which tends to prioritise the sense of community over visuals.

Joost van Dreunen, a market analyst and professor at New York University, said
it was clear what younger generations value in their video games: “Playing is an excuse for hanging out with other people.”

When millions are happy to play old games with outdated graphics — including Roblox (2006), Minecraft (2009) and Fortnite (2017) —
it creates challenges for studios that make blockbuster single-player titles.

The industry’s audience has slightly shrunk for the first time in decades.
Studios are rapidly closing and sweeping layoffs have affected more than 20,000 employees in the past two years, including more than 2,500 Microsoft workers.
Many video game developers built their careers during an era that glorified graphical fidelity.

They marveled at a scene from The Last of Us: Part II in which Ellie, the protagonist,
removes a shirt over her head to reveal bruises and scrapes on her back without any technical glitches.
But a few years later, costly graphical upgrades are often barely noticeable.

Optimizing cinematic games for a narrow group of consumers who have spent hundreds of dollars on a console or computer may no longer make financial sense.
Studios are increasingly prioritizing games with basic graphics that can be played on the smartphones already in everyone’s pocket.

“They essentially run on toasters,” said Matthew Ball, an entrepreneur and video game analyst, talking about games like Roblox and League of Legends.
“The developers aren’t chasing graphics but the social connections that players have built over time.”

Developers had long taught players to equate realism with excellence, but this new toaster generation of gamers is upsetting industry orthodoxies.

Rami Ismail, a game developer in the Netherlands, recalled a question that emerged early in the coronavirus pandemic
and has become something of an unofficial motto in the video game industry.

“How can we as an industry make shorter games with worse graphics made with people who are paid well to work less?” Ismail said.

“If we can, then there might be short-term hope,” he continued. “Otherwise I think the slow strangulation of the games industry is ongoing.”

full report:
I would much rather developers focus on the game world than on graphics. Give me towns and cities that are properly populated and feel lived-in. Or give me 20 enemies on screen with unique animations.

That is my biggest concern with UE5. Yeah, it can produce great graphics, but is it good at processing living, breathing worlds? I have yet to see a UE5 game accomplish that. This generation should be doing so much more from a gameplay perspective. It feels like many developers are just chasing high-end graphics instead of what actually makes a game good.
 
Last edited:

GymWolf

Member
I would much rather developers focus on the game world than on graphics. Give me towns and cities that are properly populated and feel lived-in. Or give me 20 enemies on screen with unique animations.

That is my biggest concern with UE5. Yeah, it can produce great graphics, but is it good at processing living, breathing worlds? I have yet to see a UE5 game accomplish that. This generation should be doing so much more from a gameplay perspective. It feels like many developers are just chasing high-end graphics instead of what actually makes a game good.
I suspect that's less an engine problem and more of a dev choice.
 

Msamy

Member
Here is a report by the NY Times from a few days ago which, while only partially true, explains why the entire industry seems to have made a unanimous decision that visual fidelity should no longer be the priority: fewer and fewer gamers appreciate the technical aspects/visuals of games, and more and more are interested in online gaming, which tends to prioritise the sense of community over visuals.

Joost van Dreunen, a market analyst and professor at New York University, said
it was clear what younger generations value in their video games: “Playing is an excuse for hanging out with other people.”

When millions are happy to play old games with outdated graphics — including Roblox (2006), Minecraft (2009) and Fortnite (2017) —
it creates challenges for studios that make blockbuster single-player titles.

The industry’s audience has slightly shrunk for the first time in decades.
Studios are rapidly closing and sweeping layoffs have affected more than 20,000 employees in the past two years, including more than 2,500 Microsoft workers.
Many video game developers built their careers during an era that glorified graphical fidelity.

They marveled at a scene from The Last of Us: Part II in which Ellie, the protagonist,
removes a shirt over her head to reveal bruises and scrapes on her back without any technical glitches.
But a few years later, costly graphical upgrades are often barely noticeable.

Optimizing cinematic games for a narrow group of consumers who have spent hundreds of dollars on a console or computer may no longer make financial sense.
Studios are increasingly prioritizing games with basic graphics that can be played on the smartphones already in everyone’s pocket.

“They essentially run on toasters,” said Matthew Ball, an entrepreneur and video game analyst, talking about games like Roblox and League of Legends.
“The developers aren’t chasing graphics but the social connections that players have built over time.”

Developers had long taught players to equate realism with excellence, but this new toaster generation of gamers is upsetting industry orthodoxies.

Rami Ismail, a game developer in the Netherlands, recalled a question that emerged early in the coronavirus pandemic
and has become something of an unofficial motto in the video game industry.

“How can we as an industry make shorter games with worse graphics made with people who are paid well to work less?” Ismail said.

“If we can, then there might be short-term hope,” he continued. “Otherwise I think the slow strangulation of the games industry is ongoing.”

full report:
This report from the NYT is just some bullshit.
 
I value temporal stability over basically anything else, and when it comes to temporal stability there has been a clear step back from the best last-gen games.

I honestly couldn't care less about realistic lighting; a pleasing final image is much more important to me.
Lumen especially is just DOGSHIT, like legit worse than Xbox 360 era lighting. what use is "realism" if this realism comes with a massive side dish of RT boiling, constant ghosting, constant light leakage, and very noticeable and slow accumulation?

I'd take prebaked lighting a la Mirror's Edge over literally any Lumen implementation on the market so far.
[Mirror's Edge screenshots]


and the prebaked GI and shadows in Mirror's Edge were massively limited by the memory sizes at the time. imagine this game targeting a console with 16GB of memory... the quality of the prebaked GI and shadows would destroy Lumen even more


You are comparing STATIC lighting (that's missing a ton of AO and colour bounce, btw) to dynamic lighting here.
That comparison makes no sense whatsoever, unless you want your games completely static, including any and all effects in them...
 

PeteBull

Member
I tried to replay Bloodborne recently and somehow the 30fps there feels worse than something like Spider-Man, and the game doesn't even have a locked 30...
It's not only 30fps; it has very uneven frametimes, and that's the biggest problem. It makes a huge difference when playing, especially with the kind of gameplay Bloodborne requires :)
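For anyone who hasn't seen it quantified: a properly paced 30fps game holds every frame for exactly two 60Hz refreshes, while Bloodborne's frames persist for one, two, or three (the pattern below is illustrative; DF documented the real cadence):

```python
# Frame persistence on a 60Hz display, in vsync intervals (16.7 ms each).
vsync_ms = 1000 / 60
paced   = [2, 2, 2, 2, 2, 2]   # proper 30fps: every frame shown for 33.3 ms
unpaced = [2, 3, 1, 2, 3, 1]   # same 30fps average, uneven delivery
for name, pattern in (("paced", paced), ("unpaced", unpaced)):
    print(name, [f"{n * vsync_ms:.1f} ms" for n in pattern])
# The average framerate is identical; the judder comes entirely from the variance.
```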
 

rofif

Can’t Git Gud
It's not only 30fps; it has very uneven frametimes, and that's the biggest problem. It makes a huge difference when playing, especially with the kind of gameplay Bloodborne requires :)

The upside is that the game is very responsive and controls great. Way faster than most other 30fps games.
Compare that to the super-stable 30fps mode in Demon's Souls, which is like playing through the cloud….

The frame pacing issue is way overblown for Bloodborne. It's not constant and honestly it never bothered me.
 