
Graphical Fidelity I Expect This Gen

alloush

Member
This is easily the dumbest post in the last two weeks. And given that, you know, *gestures to everything* that's really impressive.
People are getting too comfortable with the insults, so imma go ham on everyone now; I've been far too respectful.

I don’t need an opinion from a cunt mucus like yourself.

People can learn a thing or two from SlimySnake SlimySnake or ChiefDada ChiefDada on how to disagree without resorting to crap!
 

GymWolf

Member
Perfect, for me the problem with every game is that half the time shit is glowing when it should be casting a shadow and not getting lit.
So RTX fixes the biggest flaw of game graphics. Nothing could be more worth it.
Good for you.

After trying the game maxed out on PC, I can safely say that it looks WAY better in screenshots and on/off comparisons than it does live on my 4K OLED.

I always had this impression after my first run with the game and after seeing other people's screenshots in here.
I thought it was because I wasn't maxing out the game (I mean, I did as an experiment, but it was unplayable on a 2070 Super of course).

But now I tried the real deal, and the game can't look better than literally maxed out in 4K; there is no trick or setting that I'm missing. I even tried DLAA for best picture quality, and my suspicions turned into a full realization.

RDR2 maxed out on PC is way more pleasant for me to look at; hell, even Dead Island 2 looked not much worse tbh.
 
Last edited:

alloush

Member
Alloush is having a slimysnake moment.

And I love it.

Watch this GameSpot review featuring footage from overdrive mode sent directly to them by CD Projekt themselves, and tell me if this is the greatest looking game you've seen or not. It looked surprisingly mediocre to me. But then again, it is literally DLC of a 2020 game that I thought wasn't as good looking as TLOU2 on the whole, and I played it in 2020 in RT mode.



I am Team Alloush. It takes balls to criticize a path tracing game that will likely win best graphics from everyone and their mothers. I love this YOLO energy. Even respect it.

It’s funny when I was arguing with all these nerds I remembered you and said “how does Slimy keep his composure” haha.

This is a game that looks so gamey and outdated, nothing about it is next gen, and I will die on this hill. I don't budge and don't give in to pressure or say things people like to hear, which is why I highly respect someone like GymWolf GymWolf .

But I never disrespected anyone either. People like to resort to insults, name calling and whatnot; they cannot have a civilized convo. The only one who tried to convince me by having an actual civilized convo was Turk1993 Turk1993 .
 

Toots

Gold Member
Huh? Most games are iterative on gameplay and they don't stray too far from what was already there.
Let's agree to agree then, I guess, but I'll add that some things are more iterative than others, to paraphrase Orwell.
Also iterative is quite the euphemism when we are talking in fact about a painfully unoriginal product with thrice reused assets.

Or you could have gone even broader with your statement:
every human achievement is iterative and we don't stray too far from what was already there.
Newton standing on the shoulders of giants and all that crap.
 

Lethal01

Member
RDR2 maxed out on PC is way more pleasant for me to look at; hell, even Dead Island 2 looked not much worse tbh.
There's also tons of games I think look better than it, but that's due to art design.
Like Tears of the Kingdom in 4K, Guilty Gear and FF7 Remake.

Won't try to sway your opinion; just hoping devs do what I want, which is the opposite of what you want.
 

GymWolf

Member
There's also tons of games I think look better than it, but that's due to art design.

Like Tears of the Kingdom in 4K, Guilty Gear and FF7 Remake.
That small list gave me cancer, thanks :messenger_kissing_smiling:

(Except Guilty Gear, that one always looked fabulous)
 

LiquidMetal14

hide your water-based mammals
Starfield still amazes me with the amount of detail it has on PC.
JNb4cyj.jpg

YgUiRmO.jpg

98KPS44.jpg

j4gmplA.jpg

BWGNVm6.jpg

tYmz547.jpg
The problem with games like this is that the lighting and everything else is so flat and uniform. When you point a flashlight at something or somebody, you just notice the lack of good quality ambient occlusion, and it loses a lot of its luster. It reminds me of looking at a game from 10 or 15 years ago where the lighting is flat. When you add modern shaders, it's a night and day difference.

Not that we need another reason to give Starfield more grief, but while in parts it looks very clean, the overall visual polish is nowhere near the elites. Only because it's a newer generation game does it inherit the benefit of looking more modern. It falls so flat in many regards compared to any competent-looking game on a technical level that it's fine to admire it, but it's laughable to compare it to anything we consider a benchmark visually. Cyberpunk comes to mind when I say benchmark.
 
I'll have you know I've been going on about real time raytracing for over a decade now.

I know a few people who have been obsessed with ray-tracing for years now, but they usually come from a background of working in visual effects and artistry. Not sure if you're the same?

Beyond that, I never saw a widespread demand from the gaming audience specifically for real-time ray-tracing, but then again the audience doesn't know what it wants until you show it to them. I think a lot of demand did come from developers as well, as fully fledged RTGI can speed up the development process and also do a better job than baked lighting.

Mark Cerny even remarked how most developers didn't even ask for ray-tracing in the development of the PS5; he attributes it to the developers thinking there wouldn't be enough RT horsepower. I personally think it's because they didn't give a shit. I remember watching a few podcasts with game developers, and the features they usually want in a GPU are more RAM and improved memory sub-systems like cache coherency. A good example was the VFX artist who worked at Infinity Ward: he mentioned that they tested ray-tracing systems in COD development but found little to no visual improvement, so it was scrapped. Other developers also claimed their baked lighting system was good enough and they didn't see a need to implement RTGI when taking into account its performance cost.

I think ray-tracing matters, and it'll be good to see more games adopt RTGI, but it's not a feature I'm crying out for; maybe it's the opposite for you, since your eyes are a little more trained to spot lighting accuracy. There have also been dozens of discussions on GAF furiously debating which games are using RT leading up to their release, fueled by gameplay footage and a lack of developer comments. For me that says enough about the "significant visual improvements" RT offers, which posters on here try to claim.
 

Lethal01

Member
I know a few people who have been obsessed with ray-tracing for years now, but they usually come from a background of working in visual effects and artistry. Not sure if you're the same?

Beyond that, I never saw a widespread demand from the gaming audience specifically for real-time ray-tracing, but then again the audience doesn't know what it wants until you show it to them. I think a lot of demand did come from developers as well, as fully fledged RTGI can speed up the development process and also do a better job than baked lighting.

Mark Cerny even remarked how most developers didn't even ask for ray-tracing in the development of the PS5; he attributes it to the developers thinking there wouldn't be enough RT horsepower. I personally think it's because they didn't give a shit. I remember watching a few podcasts with game developers, and the features they usually want in a GPU are more RAM and improved memory sub-systems like cache coherency. A good example was the VFX artist who worked at Infinity Ward: he mentioned that they tested ray-tracing systems in COD development but found little to no visual improvement, so it was scrapped. Other developers also claimed their baked lighting system was good enough and they didn't see a need to implement RTGI when taking into account its performance cost.

I think ray-tracing matters, and it'll be good to see more games adopt RTGI, but it's not a feature I'm crying out for; maybe it's the opposite for you, since your eyes are a little more trained to spot lighting accuracy. There have also been dozens of discussions on GAF furiously debating which games are using RT leading up to their release, fueled by gameplay footage and a lack of developer comments. For me that says enough about the "significant visual improvements" RT offers, which posters on here try to claim.

Maybe you are right that devs didn't want raytracing, but they did want better dynamic lighting, and nothing is competing with raytracing when it comes to that.

The way I see it, lighting is the main thing you can't get wrong if you wanna look realistic.
An object with low-res textures can exist; an object with simple geometry can exist; a solid object that casts no shadows and only reflects light to your eyes can't.

Even if your materials, textures and geometry are all wrong and your in-game car looks like it's made of clay or paper, with proper lighting it still looks like a real car made out of clay/paper. Real lighting is all it takes to be photorealistic.
 

Lethal01

Member


For all I know this is the most graphically impressive game ever, but the darkness combined with youtube compression turns the video into a blurry mess so who knows.
 

GymWolf

Member
Maybe you are right that devs didn't want raytracing, but they did want better dynamic lighting, and nothing is competing with raytracing when it comes to that.

The way I see it, lighting is the main thing you can't get wrong if you wanna look realistic.
An object with low-res textures can exist; an object with simple geometry can exist; a solid object that casts no shadows and only reflects light to your eyes can't.

Even if your materials, textures and geometry are all wrong and your in-game car looks like it's made of clay or paper, with proper lighting it still looks like a real car made out of clay/paper. Real lighting is all it takes to be photorealistic.
A shit texture is gonna be shitty no matter the light source.

This is why old games with RTX like Quake or Minecraft still look nowhere near modern games, let alone photorealistic...

You can't enhance details when they are simply not there to begin with.

Hell, I can still point out every bad texture in path-traced Cyberpunk with both my eyes tied behind my back.
 
A shit texture is gonna be shitty no matter the light source.

This is why old games with RTX like Quake or Minecraft still look nowhere near modern games, let alone photorealistic...

You can't enhance details when they are simply not there to begin with.

Hell, I can still point out every bad texture in path-traced Cyberpunk with both my eyes tied behind my back.

I was literally going to type the same thing lol. If we go by such standards, Quake 2 and Portal RTX are the most photo-realistic games ever, but that's simply not true.

Photo-realism has always been a combination of micro-polygon details, high resolution textures and dynamic multi-bounce lighting. I don't think it's a case of one over the other.
 

GymWolf

Member
It's shocking to me that SM2 isn't being called out for being the exact same game as SM1 & Miles.
I loved playing SM1 and Miles, and replayed them often. But I don't want to play the same game again.
It even has the exact same scripted chase scenes.

Timestamp 2:55


It's astonishing. It will even have the same scripted-to-the-T side missions. People wonder how Insomniac is able to release so many games in a short amount of time. It's simple. ITS THE SAME GAME.

It will still come with the same 100% scripted side mission where you go on a police chase that is 10000% scripted and completely on rails, where you jump on the roof of the car, one guy shoots at you, you web him, then the driver shoots at you, you web him, and then you jump back, web the car, and the chase is over.

Same thing over and over and over again. There's virtually NO AI in this game. Absolutely none. The closest thing to AI is the system that picks how many enemies can attack you at once, when they can attack you, and with what intensity.

If any other open world game pulled something like this, they would be crucified.

At the very least, put AI in the game. The side-mission police chase should be an actual chase between police AI and NPC AI, in which you show up to save civilians from being killed and stop the NPC AI. No two police chases would be alike. But no, 1000% scripted. Even the chases in main missions are 100% scripted. The entire city and its NPCs are literally just wallpaper. Quite the contrast from every other open world game (GTA, Watch Dogs, etc.)

I and others expected something like this:




Do 5% of what other open world games do. Maybe then people can overlook the fact that the graphics just look incrementally better.

I almost want to kiss you...almost.
 

CGNoire

Member
Nah, but apparently keeping it real on these forums triggers people. But just found out, CP is actually the red line for people in this thread.
Some people care a lot more about nuanced lighting than others. In all fairness, CP's assets and animations were outdated on release, and they have done a really good job matching the artistic look in the OD mode; the value balance is super similar now, keeping the aesthetic a lot closer.

For me, as a painter, even the subtle differences between the two scream out at me... I fucking hate light bleeding with a passion, and a constant directionality of lighting is a requirement for me going forward. Too bad about the assets though :/
 

Lethal01

Member
A shit texture is gonna be shitty no matter the light source.

This is why old games with RTX like Quake or Minecraft still look nowhere near modern games, let alone photorealistic...

You can't enhance details when they are simply not there to begin with.

Hell, I can still point out every bad texture in path-traced Cyberpunk with both my eyes tied behind my back.
I was literally going to type the same thing lol. If we go by such standards, Quake 2 and Portal RTX are the most photo-realistic games ever, but that's simply not true.

Photo-realism has always been a combination of micro-polygon details, high resolution textures and dynamic multi-bounce lighting. I don't think it's a case of one over the other.

minecraft-1-650x487.png

Simple shapes and simple colors can exist in real life and be photorealistic.
Incorrect lighting is physically impossible; it just doesn't happen.

Photorealism is lighting, lighting is photorealism.
 

GymWolf

Member
minecraft-1-650x487.png

Simple shapes and simple colors can exist in real life and be photorealistic.
Incorrect lighting is physically impossible; it just doesn't happen.

Photorealism is lighting, lighting is photorealism.
You are picking an example of objects in real life that try to depict shit videogame graphics.


If people and normal objects had shit textures in real life, your theory could work, but in realistic games we try to recreate normal stuff, not Minecraft's horrible art style.

This is how ray-traced Minecraft looks:


It's dogshit.
 

Hunnybun

Banned
Some of the best games ever dropped frames to 20 fps: MGS1, GTA San Andreas, Shadow of the Colossus; even MGS4 had some big drops during the chase sequences.

Back in the day we just understood that the devs were pushing the system hard and accepted the lower framerate as a compromise. Now we are all like "Give me 60 fps or give me death." And devs comply by giving us 60 fps and last gen trash. Digital Foundry really fucked everything up, didn't they?

So that's what you want, back to 20fps as standard like Ocarina of Time?

After all: just imagine the fidelity!

Things have progressed, is all. Last gen finally got us locked 30fps, and now we're getting the OPTION of 60fps (generally at unacceptably low resolutions, remember, so it's not as if the quality modes are running at 4K or anything).

People like you who value the quality of graphics as seen in screenshots have had their way for about 30 years. Do you really begrudge a bit of compromise with those of us who value how a game looks in motion?
 

SlimySnake

Flashless at the Golden Globes
So tldr performance in games is below 4x, and theoretically below 5x even tho they mention their test is inaccurate, so not the 8x cpu upgrade like u claimed, ty kindly xD
Performance in last gen games that aren't designed to scale up is not an efficient way to measure performance. That's why we use synthetic benchmarks like Cinebench. That's pretty much how all CPUs are measured, because games have been so poorly optimized around CPU strengths since last gen.

It's actually fairly easy to come up with the 8x figure for theoretical performance.

  • PS4 had 8 cores; PS5 has 8 cores and 16 threads. So right off the bat, you have 2x more power.
  • PS4 ran at 1.6 GHz, PS5 runs at 3.5 GHz: once again roughly 2x more power. Combined, we are up to 4x.
  • Jaguar to Zen 2 IPC gains were roughly 2x. That's around 8x.

This chart actually shows this rather well. And Richard was using the pre-release 3.2 GHz clocks, not the 9% faster clocks the PS5 ended up shipping with.

Increase the Zen 2 858 results by 9% and you get a 7.3x faster CPU compared to his Jaguar CPU.

TQnqLzz.jpg


This is very similar to the calculations we did for the GPU to determine just how powerful the PS5 GPU was.
  • PS4 had 18 CUs, PS5 has 36. That's 2x.
  • PS4 ran at 0.8 GHz, PS5 at 2.23 GHz. That's 2.7x, and combined 5.5x: your raw tflops.
  • IPC gains from GCN through Polaris to RDNA were about 1.5x. That gets us to 8.3x, or roughly 15.3 tflops of theoretical performance.

Not all games see an 8x performance increase over the base PS4 versions because of various bottlenecks. But some, like Death Stranding and Uncharted 4, show an 8x increase in pixel budget, as they both have unlocked native 4K 60 fps modes that hover around 50 frames, getting us that 7-8x increase in pixels per frame. A lot of PS4 games like Horizon and GOW are heavily CPU bottlenecked on PC. I remember being able to run HZD at native 4K ultra settings at 60 fps, but even reducing the resolution by 4x to 1080p medium console settings didn't let me go past 105 fps. MGSV also topped out at 105 fps. Apparently my CPU, which was much faster than the Jaguar CPUs, was the bottleneck because the game was single threaded and wanted even higher clocks.
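The multipliers in the post are easy to sanity check in a few lines. This is just that same back-of-the-envelope arithmetic, using the public PS4/PS5 clocks and CU counts; the 2x SMT, 2x CPU IPC, and 1.5x GPU IPC factors are the post's own rough assumptions, so this is an illustration, not a measurement.

```python
# Back-of-the-envelope check of the theoretical scaling factors quoted above.
# Clocks and CU counts are the public PS4/PS5 specs; the SMT and IPC
# multipliers are the post's own rough assumptions.

# CPU: Jaguar (8 cores, 1.6 GHz) -> Zen 2 (8c/16t, 3.5 GHz)
smt_factor = 2.0                 # 8 cores -> 16 threads (rough upper bound)
cpu_clock = 3.5 / 1.6            # ~2.19x clock increase
cpu_ipc = 2.0                    # approximate Jaguar -> Zen 2 IPC gain
cpu_theoretical = smt_factor * cpu_clock * cpu_ipc     # ~8.75x

# GPU: 18 CUs @ 0.8 GHz -> 36 CUs @ 2.23 GHz
cu_factor = 36 / 18              # 2x
gpu_clock = 2.23 / 0.8           # ~2.79x
raw_tflops_factor = cu_factor * gpu_clock              # ~5.6x raw tflops
gpu_ipc = 1.5                    # rough GCN -> RDNA per-flop efficiency gain
gpu_effective = raw_tflops_factor * gpu_ipc            # ~8.4x

ps4_tflops = 1.84
effective_tflops = ps4_tflops * gpu_effective          # ~15.4 "GCN-equivalent" TF

# Pixel budget: 1080p30 -> native 4K60 is an 8x pixels-per-second increase
pixel_factor = (3840 * 2160 * 60) / (1920 * 1080 * 30)

print(round(cpu_theoretical, 2), round(gpu_effective, 2),
      round(effective_tflops, 1), pixel_factor)
```

The numbers land close to the post's figures (roughly 8x CPU, 8.3x GPU, ~15 TF effective, 8x pixel throughput), which is all "theoretical performance" means here.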

Devs have A LOT of power available to them, if they care to utilize it. Just like with Nanite and Lumen, devs have to redesign their engines to take advantage of these GPUs and CPUs. Go all in on multithreading, and getting slightly better AI should not be too hard to do.

EDIT: Just read that the FF7 Rebirth party members don't just sit on the sidelines like that Digital Trends preview says lol. They do help; you just can't control them. I'm OK with that. That is more of a gameplay balance thing. I had an issue with them standing on the sides waiting to be subbed in lmao.
 

winjer

Member
Cinebench performance does not translate well into gaming performance.
CB uses a small amount of cache, mostly L2, while games use a lot more cache, especially for branching.
The other issue is that CB has a very linear calculation set, so it does not use the front end of the CPU as heavily as games do.
So those scores will not show the performance benefits of the PS5 having an L3 cache, nor of having a better out-of-order (OoO) engine.
 

SlimySnake

Flashless at the Golden Globes
Cinebench performance does not translate well into gaming performance.
CB uses a small amount of cache, mostly L2, while games use a lot more cache, especially for branching.
The other issue is that CB has a very linear calculation set, so it does not use the front end of the CPU as heavily as games do.
So those scores will not show the performance benefits of the PS5 having an L3 cache, nor of having a better out-of-order (OoO) engine.
Yeah, Richard mentioned that in his article. He actually used that test precisely because the PS5's CPU cache is not the same as the cache of the Zen 2 CPU he was using to test. So it is actually a fairly accurate way to measure the raw CPU power increase.

Obviously in-game performance no longer scales with CPUs, especially in poorly multithreaded games. We were talking about this in the Starfield thread just a few weeks ago. But the power is there, at least for better AI and other CPU-heavy tasks like physics and destruction.
 

PeteBull

Member
Synthetics are all theory; real games don't use all cores equally, and a game's workload is not Cinebench's. We're talking game performance, and that's 3x, not 8x. TLOU Part 1 runs at 1440p60 with dips to 50 in performance mode, and GoW: Ragnarok dips below 80fps on PS5 while the PS4 version holds a stable 30, so actual real in-game CPU performance is 3x, not 8x like in theory / in the Cinebench multicore test.
Just compare C23 scores of different CPUs against their actual performance in games: even in the best case scenario, when a game has really good multicore code like CP2077, a theoretical 8x C23 difference is never that much in a real game.

There is not a single example in any game where the actual CPU in the PS5 or XSX delivers more than 3x the performance of the last gen consoles, and that's a hard fact.

Again, we don't have to look far: MS first party, Bethesda and its new child. Starfield runs terribly on XSX, with dips to 25fps and 1-second pauses in CPU-bottlenecked areas.
If the XSX CPU were actually 8x as powerful we wouldn't have those there. It's not a multiplat dev working on 5 different versions of the game; the Bethesda dev team is huge and only needed to work on the PC version (which is very unoptimised and u can tell it got very lil love, hence the lack of DLSS even) and XSS/XSX, yet in CPU-demanding scenarios we still run into a visible bottleneck.

Here is a vid where Rich from DF builds a "frankenstein console", aka the same CPU as the XSX and a GPU very similar to the PS5's, and even that can play the game better than the XSX.


Another MS first party title, Halo Infinite, runs on the last gen Xbox One X in 60fps performance mode with dips, so u would think the XSX 120fps mode is rock solid, but nope: dips below 80fps occur in battles / while driving despite really heavy dynamic res scaling (960p fuckin lowest).

I keep telling u guys these current gen consoles are midrange PCs hardware-wise, and u are somehow ignoring the hard facts.
Just use a bit of logic: if the current gen consoles' CPU were actually 8x faster than the last gen CPU, we wouldn't have dips below 80fps in 120fps modes in CPU-bottlenecked scenarios.
 

SlimySnake

Flashless at the Golden Globes
It's not just a pipeline benefit; it's a game changer for design and fidelity. If you don't see that, it's because you don't understand what you are talking about. And please don't take my words as a personal attack; not everybody needs to know these things.

It's better, not a significant step above.

You clearly don't know what you're talking about. Even with an army of talented artists adding ambient lights, hero lights... You couldn't light everything properly to look good or realistically accurate.

There's no need to check a box, it's a new way of doing things that weren't possible before.

4A Games were talking about RT.

I can't show you exactly why I know, because I don't want to put myself or others in trouble. But I will say that if you have some knowledge of systems architecture and memory management, and watch dev presentations, you could easily get enough info to see why I said that the PS5's streaming capabilities are not fully used yet.

One tidbit. Same engine, same game:



As you can see, a 3.2 GB/s external SSD with no custom controller could stream the game during gameplay the same as the 5.5 GB/s internal SSD using the dedicated controller.

Anyway, I'm sure even if I put the reason right in front of you, you wouldn't believe me.

My thoughts on RTGI might come across as a bit schizo, but hear me out.

On one hand, games like TLOU2 and Red Dead 2 have fantastic lighting that I find way more pleasant to look at than, say, the RTGI lighting in games like Cyberpunk, Metro Exodus and Star Wars overall. But thanks to RTGI, those look far more consistent than TLOU2, which can go from looking like this

FDJaKA4XIAkZq2p

to this
Fzrb5_HWYAIyCdi


and this

E19rvf5X0AI72eJ


You can see shadows completely missing from the last two shots, so some kind of RTGI or RTAO/RT shadows solution would've helped there. Spider-Man 2 will definitely have areas that would benefit from a more dynamic lighting solution to keep it consistent throughout its open world.

A good example of this is the Witcher 3 next gen patch, which completely changes the foliage as you enable RT. I don't know if it's RTGI or RTAO or RT shadows making this massive change to the foliage, but I've seen this in Cyberpunk's plants as well.

F41BalWWgAAQBcL

F41BalTWoAAheH1


Now comes the schizo part.

The problem with RTGI being slapped on a last gen lighting solution is that it only makes it slightly more accurate in 99% of cases, as we can see in Cyberpunk's latest footage. I want devs to improve their base lighting model first, and if software GI like software Lumen is the answer, then great. However, I have yet to see games with lighting like the Matrix demo and the first PS5 demo despite several games shipping on UE5, so clearly an engine that enables GI isn't enough. Devs and their artists need to go in and create the right color tones instead of just hoping RTGI will light an environment correctly. So in that case, I would first like Insomniac to improve their base lighting model, and then they can worry about whether or not their engine can handle RTGI. Get us close to that 2021 reveal trailer's lighting.
 

Robbinhood

Banned
You seriously don't understand why it's not a fair comparison? LOL
LOL indeed. We are comparing the next gen upgrade. You don't think that requires more processing power to render those distant objects? In addition, it's playable. Like, the game can't win with some of you. What a silly stance.
 

MidGenRefresh

*Refreshes biennially
LOL indeed. We are comparing the next gen upgrade. You don't think that requires more processing power to render those distant objects? In addition, it's playable. Like, the game can't win with some of you. What a silly stance.

I think that the game looks good, and I'm getting it day uno. But this is simply not a fair comparison. Maybe someone with more patience will explain to you why; I don't have the energy for it. Plus the 2.0 update for Cyberpunk just dropped, so bye.
 

SlimySnake

Flashless at the Golden Globes
Synthetics are all theory; real games don't use all cores equally, and a game's workload is not Cinebench's. We're talking game performance, and that's 3x, not 8x. TLOU Part 1 runs at 1440p60 with dips to 50 in performance mode, and GoW: Ragnarok dips below 80fps on PS5 while the PS4 version holds a stable 30, so actual real in-game CPU performance is 3x, not 8x like in theory / in the Cinebench multicore test.
Just compare C23 scores of different CPUs against their actual performance in games: even in the best case scenario, when a game has really good multicore code like CP2077, a theoretical 8x C23 difference is never that much in a real game.

There is not a single example in any game where the actual CPU in the PS5 or XSX delivers more than 3x the performance of the last gen consoles, and that's a hard fact.

Again, we don't have to look far: MS first party, Bethesda and its new child. Starfield runs terribly on XSX, with dips to 25fps and 1-second pauses in CPU-bottlenecked areas.
If the XSX CPU were actually 8x as powerful we wouldn't have those there. It's not a multiplat dev working on 5 different versions of the game; the Bethesda dev team is huge and only needed to work on the PC version (which is very unoptimised and u can tell it got very lil love, hence the lack of DLSS even) and XSS/XSX, yet in CPU-demanding scenarios we still run into a visible bottleneck.

Here is a vid where Rich from DF builds a "frankenstein console", aka the same CPU as the XSX and a GPU very similar to the PS5's, and even that can play the game better than the XSX.


Another MS first party title, Halo Infinite, runs on the last gen Xbox One X in 60fps performance mode with dips, so u would think the XSX 120fps mode is rock solid, but nope: dips below 80fps occur in battles / while driving despite really heavy dynamic res scaling (960p fuckin lowest).

I keep telling u guys these current gen consoles are midrange PCs hardware-wise, and u are somehow ignoring the hard facts.
Just use a bit of logic: if the current gen consoles' CPU were actually 8x faster than the last gen CPU, we wouldn't have dips below 80fps in 120fps modes in CPU-bottlenecked scenarios.

I don't disagree that the actual CPU performance isn't 8x. But the theoretical power is definitely there. The problem with most of your comparisons is that they only take into account single-threaded performance, and as I laid out above, that's roughly 2x from higher clocks and 4x after IPC gains. These games are not utilizing the extra threads on PC CPUs. I've seen it time and time again: some CPU threads literally sit there under 10% utilization in A LOT of last gen games, because they were designed around the single-threaded 8-core Jaguar CPUs. Starfield and Cyberpunk are maybe the only games that scale well with higher core counts, and Starfield actually stops scaling after 6 cores. It apparently runs better if you turn off hyperthreading, so it's not really a good example of utilizing the 2x more threads in next gen CPUs. So basically that's why you are seeing 3-4x better performance right now.

That said, I would love to see Starfield running on that Jaguar-based Athlon CPU Richard used in his benchmarks.

P.S. Halo Infinite ran like shit on my CPU, which boosts up to 5.0 GHz, so roughly 40% more powerful than the XSX CPU. It's just a very poorly optimized game at higher framerates, just like Horizon 1 and GOW.
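The "theoretical 8x vs. real-world 3-4x" gap both sides are arguing about can be sketched with an Amdahl's-law style estimate. The fractions and per-portion speedups below are invented for illustration (roughly 3x realized single-thread gain, 8x when all threads are used), not measured from any game:

```python
# Illustrative only: an Amdahl's-law style estimate of why a CPU with ~8x
# theoretical throughput might only show ~3-4x in real games. The serial
# fractions and speedups here are made-up numbers for the sketch.

def effective_speedup(parallel_fraction: float, parallel_speedup: float,
                      serial_speedup: float) -> float:
    """Overall speedup when only part of the frame workload scales across threads.

    parallel_fraction: share of CPU work that can use the extra threads fully
    parallel_speedup:  speedup on the parallel portion (e.g. ~8x theoretical)
    serial_speedup:    realized speedup on the single-threaded portion (~3x)
    """
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction / serial_speedup
                  + parallel_fraction / parallel_speedup)

# A last-gen engine with most work on one thread: stuck near the
# single-thread gain even though 8x is theoretically available.
print(round(effective_speedup(0.3, 8.0, 3.0), 2))

# A well-threaded engine closes most of the gap toward the theoretical 8x.
print(round(effective_speedup(0.9, 8.0, 3.0), 2))
```

With mostly-serial work the overall gain sits in the 3-4x range PeteBull observes, while a heavily threaded workload approaches 7x, which is the point about engines needing to go all in on multithreading.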
 

Robbinhood

Banned
I have a better comparison that shows how advanced Spiderman 2 is compared to Spiderman 1 on PS4.

Venom PS5.
spiderman-2-071823-1.jpg



Venom PS4:
2560px-Very_Black_screen.jpg


Generational Difference.
God awful response.

Both are rendering the same city and surrounding area. The next gen game has a way further LOD. How is that not valid?

Wanna talk about shit points? You mentioned "At least TOTK has an entire underground area now and a sky, while SP2 is the same city".

TOTK's underground is literally a cave with mud textures and a few structures here and there. Meanwhile SP2 doubled its map size AND significantly updated the assets of the original map, which TOTK did not do. Y'all are the definition of disingenuous.
 

Zuzu

Member
Yeah, I don't reckon Spider-Man 2 looks that impressive for a sequel on new console hardware. It looks maybe 35% better. The character models still look PS4 generation. I do like the look of the ray-traced reflections on the water though; that's really nice.
 