
Graphical Fidelity I Expect This Gen

CamHostage

Member

Those veins look just like a normal map.
I assumed the muscle deformation was ML/GPU based as we've seen in other recent games and UE demos, but ultimately idk.

No, I think Ocasm is right that this is not muscle deformation; it's the same body position with a texture or model swap to make the veins pop out. Note that it's the same muscle and muscle movement, with the bicep staying the same size, just now with veins popping out.

Muscle Deformation is about simulating the muscle groups underneath and then applying stretch of skin and adjustments of body motion in order to move as the body moves. Muscles get bigger as they are activated, they pull the skin across muscles and joints, different muscle groups engage when the body extends, etc. Veins can be part of the visible muscle mass, but I wouldn't say they're the "deformation" involved in a ML trainer aside from general inclusion of veins in the body model (but then the game designers are animating the veins at like 500% anyway to look cool and buff so what the machine learns gets blown out even if that data is used.) It's not about "deforming" the muscles already there to look more buff, it's about deforming all the muscles involved in a movement with stretch and flex to move correctly.
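To picture the distinction in code terms, here's a toy sketch (made-up numbers, nothing from SF6 or any real engine): a vein "pop" can be one authored shape faded in by a scalar, while a muscle deformer drives offsets from the pose itself.

```python
# Toy contrast: vein "pop" as a single blend weight vs. pose-driven deformation.
# All shapes/offsets are illustration data, not from any actual game.

def apply_vein_pop(base_verts, vein_shape, amount):
    """Texture/model-swap style: one authored 'veiny' shape faded in by a
    scalar. The underlying pose is untouched; the bicep stays the same size."""
    return [v + amount * d for v, d in zip(base_verts, vein_shape)]

def apply_muscle_deform(base_verts, elbow_angle_deg):
    """Deformation style: vertex offsets are a function of the pose itself,
    so the muscle bulges as the joint flexes (toy linear model here)."""
    flex = max(0.0, min(1.0, elbow_angle_deg / 150.0))  # 0 = straight arm
    bulge = 0.3 * flex                                  # bulge grows with flex
    return [v + bulge for v in base_verts]

base = [1.0, 1.2, 1.1]        # toy 1-D "radii" along the bicep
veins = [0.05, 0.08, 0.06]    # authored vein displacement shape

popped = apply_vein_pop(base, veins, amount=1.0)
flexed = apply_muscle_deform(base, elbow_angle_deg=150.0)
print(popped)  # base plus the full vein shape, pose unchanged
print(flexed)  # base uniformly bulged by the flex amount
```

The point of the sketch: in the first function nothing about the pose feeds the result, which is why a swap can run on anything; in the second, every vertex depends on the joint state every frame.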



They already have done vein-popping effects in previous SF and Tekken and MK releases, I believe? SF6 may have more topography of what's added to the model to make the veins pop, but I'm not seeing a difference in execution method. It's better than SF5, but I think it's a similar or adjacent technique.

zangief-street-fighter.gif


It's highly likely this is muscle deformation being performed on the GPU; given that the Series S doesn't have a powerful GPU, that's the end result, unfortunately.

Maybe, but the ML of muscle deformation is done before it's put into the game; a machine is trained on the body to move it as muscles should. (Which is interesting on these crazy SF character models, who are built to look awesome first and foremost and realistic somewhere down the priority line, but with enough work you can assign muscle structure to even the ridiculous muscle groups of the World Warriors.) You do need high-end hardware to then go through that massive data set of moving muscles and apply the proper evaluation as the game runs, but deformer performance in UE is supposedly "relatively lightweight in both memory usage and performance". (I think even God of War 2018 had muscle deformation system experiments, though it's not the same level of rich data simulation.)
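That offline-train / runtime-evaluate split can be sketched in miniature (this is not UE's actual ML Deformer API; the "training" here is just a least-squares line fit over fake simulation samples, standing in for a real learned model):

```python
# Toy version of the offline/runtime split behind ML deformers.
# Offline: fit a cheap model to samples from an (imaginary) muscle simulation.
# Runtime: only evaluate the fitted model, which is why it can be lightweight.

def train_deformer(samples):
    """Least-squares fit offset = a*pose + b over (pose, offset) samples,
    standing in for training on full muscle-simulation output."""
    n = len(samples)
    sx = sum(p for p, _ in samples)
    sy = sum(o for _, o in samples)
    sxx = sum(p * p for p, _ in samples)
    sxy = sum(p * o for p, o in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def evaluate_deformer(model, pose):
    """Runtime path: one multiply-add per attribute, no simulation."""
    a, b = model
    return a * pose + b

# Fake "simulation" samples: offset grows linearly with flex (0..1).
sim_samples = [(0.0, 0.0), (0.5, 0.15), (1.0, 0.3)]
model = train_deformer(sim_samples)
print(round(evaluate_deformer(model, 0.75), 3))  # ~0.225
```

The expensive part (running the muscle simulation to generate samples) happens once, before shipping; the console only ever runs the cheap evaluation.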



If it is actually ML, then it might be, as you say, just a bit too much on the Series S hardware for the framerate that Capcom needs to hit in a fighting game.
If it's an animated texture or a layer of veins/muscles swapping over the body skin, then it may be a matter of memory, and they just had to shirk one FX layer to keep the game within its parameters.

It's a cutback either way, but one would be because the machine couldn't do it, the other because the machine worked better not doing it. And we'll see in future games, but my assumption is that we will see ML muscles on Series S games...
 
Last edited:

Interesting. It looks good however they've achieved it, mind.
 

ChiefDada

Gold Member
Only downplay features that are poorly used relative to their cost, like RT reflections in SM2

What is the cost of RT reflections on CPU, GPU, and memory/memory bandwidth? And if you're crazy enough to answer a question that I know you couldn't possibly have the answer for, please follow up with another feature that you would implement instead that is equal to or less than the cost of said RT reflections.
 

OCASM

Banned
What is the cost of RT reflections on CPU, GPU, and memory/memory bandwidth? And if you're crazy enough to answer a question that I know you couldn't possibly have the answer for, please follow up with another feature that you would implement instead that is equal to or less than the cost of said RT reflections.
Are you saying RT reflections are cheap?

In terms of what I'd replace them with I'd start with variable penumbra shadows. RDR2 proved they're affordable even on a base Xbox One. Would help with the cinematic look since area lighting/soft shadowing is one of the cornerstones of Hollywood lighting and the lack of them is a big reason why games still look "gamey."
 
Early in the demo it still had enclosed-space movement, so well known from the PS4/Xbox One gen, and so hated too, because you know it's literally a forced slowdown of your movement pace so the next area can load in.
I doubt the demo had to load anything, it was probably done to show the detail of the nanite textures…
 
I disagree but yeah the 11th will show more.

It's crazy that Sony of all publishers just had their showcase, their chance to show people why they bought a PS5, and absolutely blew it. Now we're pinning our hopes on Microsoft. I agree that at this point, I have more hope for Forza Motorsport and Hellblade 2 (I don't think Starfield is going to look impressive) than anything in Sony's pipeline for the next 2 years.

But that fucking sucks. Jim Ryan and Herman can kiss my ass. We have to wait until what, the very end of the generation for Naughty Dog, Sucker Punch, Guerrilla, and Santa Monica's next games? Aside from Spider-Man 2 there are no Sony single-player, graphically-intensive games scheduled until when, 2025 perhaps? It's pretty much all GaaS stuff.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
It's crazy that Sony of all publishers just had their showcase, their chance to show people why they bought a PS5, and absolutely blew it. Now we're pinning our hopes on Microsoft. I agree that at this point, I have more hope for Forza Motorsport and Hellblade 2 (I don't think Starfield is going to look impressive) than anything in Sony's pipeline for the next 2 years.

But that fucking sucks. Jim Ryan and Herman can kiss my ass. We have to wait until what, the very end of the generation for Naughty Dog, Sucker Punch, Guerrilla, and Santa Monica's next games? Aside from Spider-Man 2 there are no Sony single-player, graphically-intensive games scheduled until when, 2025 perhaps? It's pretty much all GaaS stuff.
2025 is out of the picture, seeing as how those games would've been revealed 2-3 years in advance like Horizon FW, GoW Ragnarok, GT7, FF16, FF7 Rebirth, Forspoken, Spider-Man and Wolverine. So you're looking at 2026-2028. The worst part is that it's a total of four games: TLOU3, Wolverine, Ghost of Tsushima 2 and whatever SSM is cooking up. The rest are all GaaS games that will look last gen. So 4 games from the cream of the crop for us to look forward to for the remainder of the gen. There is no secret second team. No new IP in development that would release in the next 6 months. Just four games, one of which is from Insomniac, who have regressed to embarrassing levels and should have their Sony first-party card revoked.
 

SlimySnake

Flashless at the Golden Globes
What is the cost of RT reflections on CPU, GPU, and memory/memory bandwidth? And if you're crazy enough to answer a question that I know you couldn't possibly have the answer for, please follow up with another feature that you would implement instead that is equal to or less than the cost of said RT reflections.
Some of us game on PC and we know exactly how much RT reflections cost. The answer is anywhere from a 15-35% hit on the GPU. Shit AMD-sponsored RT like in RE4 is 15-20%, while good Nvidia-sponsored RT like in Cyberpunk, BFV and Control is up to 35%.
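Those percentages translate directly into frame-time. A quick back-of-envelope (toy arithmetic, reading "a 35% hit" as a straight 35% framerate reduction, which is the pessimistic interpretation of the figures above):

```python
# Back-of-envelope: what a given % GPU hit means in frame-time terms.

def frame_time_ms(fps):
    """Frame budget in milliseconds at a given framerate."""
    return 1000.0 / fps

def cost_of_hit(base_fps, hit_fraction):
    """If a feature costs `hit_fraction` of your framerate, frame time
    scales by 1/(1 - hit_fraction). Returns (new_frame_ms, new_fps)."""
    base = frame_time_ms(base_fps)
    new = base / (1.0 - hit_fraction)
    return new, 1000.0 / new

new_ms, new_fps = cost_of_hit(60.0, 0.35)       # the ~35% worst case quoted
print(f"{new_ms:.1f} ms -> {new_fps:.0f} fps")  # 25.6 ms -> 39 fps
```

So at the high end of those numbers, a locked 60 turns into roughly 39 fps before any other compromise is made, which is why the setting gets so much scrutiny.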

RE4 recently had a huge issue on PCs with VRAM usage jumping up by 1.5-2GB as soon as you would turn on RT, causing a lot of 8-10 GB GPUs to straight up crash and forcing most people to turn off RT on the most popular 30 series cards.

The CPU hit is also very noticeable. I don't quite know the numbers off the top of my head, but it's well known among PC gamers, YouTubers, and techies that RT has a big CPU cost. Again, a recent game like Star Wars has a pretty significant CPU hit as soon as you turn on RT.

For Spiderman on PC, Nixxes told DF why their game was so CPU-bound with RT on: the BVH structure that they need to build for RT is very expensive on the CPU. Alex did a test on the AMD 3600 CPU, which is more in line with the PS5 CPU, and found that the game was so CPU-bound that a more powerful CPU gave him 2x the framerate. That's a 100% performance difference due to the CPU alone, something unheard of in games that have been GPU-bound since last gen. Hell, forget RT on vs off; just RT high vs ultra high was a massive 50% hit.
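To see why BVH work lands on the CPU like that, here's a toy 1-D median-split build (a deliberately simplified sketch, not Nixxes' or anyone's actual acceleration structure): every build touches and partitions all primitives, and deforming scenes force that work to repeat.

```python
# Why BVH building is CPU-heavy: each (re)build recursively partitions
# every primitive. Toy 1-D median-split build over primitive intervals.

def build_bvh(prims):
    """prims: list of (center, half_size). Returns a nested
    (bounds, left, right) tree; leaves hold a single primitive.
    Engines refit or rebuild structures like this for animated meshes,
    which is the per-frame CPU cost being described."""
    lo = min(c - h for c, h in prims)
    hi = max(c + h for c, h in prims)
    if len(prims) == 1:
        return ((lo, hi), None, None)
    prims = sorted(prims)            # O(n log n) partitioning work
    mid = len(prims) // 2
    return ((lo, hi), build_bvh(prims[:mid]), build_bvh(prims[mid:]))

def node_count(node):
    (_, left, right) = node
    return 1 + (node_count(left) if left else 0) \
             + (node_count(right) if right else 0)

tree = build_bvh([(0.0, 1.0), (5.0, 1.0), (9.0, 0.5), (2.5, 0.5)])
print(node_count(tree))  # 7 nodes for 4 primitives (2n - 1)
```

Scale that to hundreds of thousands of triangles rebuilt or refitted every frame and the Zen 2 console CPUs start to look very busy.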

You don't have to be crazy to answer questions like this. Just watch any PC benchmark video or play any PC game with RT, like we have been since 2018. You can start with DF's Spider-Man PC coverage, which shows just how massive a hit RT can be on both the GPU and the CPU.

RT, just like any other setting, is something devs need to start thinking long and hard about. They already make compromises on AF, AA, shadows, draw distance, NPC counts, and foliage density. Those are settings that have maybe a 1% hit (AF) to a 5-10% hit when going from medium to high, and yet they always settle for medium or high and leave the ultra settings for PC. Yet when it comes to RT, they are willing to take a massive 20-50% hit on GPU performance plus a big increase in precious VRAM and CPU allocations. Honestly, had they spent those GPU resources on improving lighting, visual effects and character models, it would've been much easier to notice than reflections that can easily be faked with SSR.
 

CGNoire

Member
Are you saying RT reflections are cheap?

In terms of what I'd replace them with I'd start with variable penumbra shadows. RDR2 proved they're affordable even on a base Xbox One. Would help with the cinematic look since area lighting/soft shadowing is one of the cornerstones of Hollywood lighting and the lack of them is a big reason why games still look "gamey."
One of my favorite things about Remedy since QB has been their focus on soft shadowing, especially noticeable in their cutscenes with their focus on film noir techniques.
 

Edder1

Member
Ok so you agree with me that ray tracing is a current gen feature. Great. Now we get back to that pesky question of what defines a current gen game. Because you and everyone else here are quick to list games that are not current gen, without offering a definition - even if it's your own - for the nebulous term.
No, I don't agree with you that ray tracing makes a game current gen. You can apply full ray tracing (path tracing) to very old games and it wouldn't make them "next gen". But yes, it's a current gen feature that can be applied to non current gen games.

Some of the things that make up a "next gen" game are a level of geometry that isn't possible on last gen, a level of detail and density that wasn't possible before (see the Matrix demo), material shading and physically based rendering taken to a level not seen before, volumetric fog, lighting and clouds on a new level, physics that weren't possible before, etc.
 
Last edited:

PeteBull

Member
Guys, come on, let's get back to earth. A guy is posting Avatar cinematics/target renders as what he expects games to look like (btw, that bullshot trailer happened 2 years ago; at that time the strongest PC GPU available was the RTX 3090, which is roughly 2x stronger than the PS5/XSX GPUs, and if counting ray tracing then probably close to 3x).

So basically we have 2 options: Ubisoft, like always, presented a fake trailer/target render/cinematic, aka in layman's terms not an actual gameplay trailer, or the very, very unlikely 2nd option: a few years before the game's launch they already had a working build of the game at a stable 30fps, running on a GPU 2x stronger than the one in current gen consoles, that looked gorgeous af :)

Just a reminder, that's how this trailer looks.
We don't have "running on PS5" there, or even alpha footage, or even in-engine footage, nothing; that should give you enough info about how real it is.

And remember guys, we're talking Ubisoft here; they are famous for showing bullshot trailers a few years before the actual launch of the game. It's not even a downgrade as such, coz to downgrade there has to be a better-looking version that gets downgraded; it's just a full-on cinematic, not the real game, not even in-engine footage.

We all love groundbreaking graphics here, but I assume most of us are veteran gamers who have at least 2 console gens (more likely even 3 or 4) already under our belts; we can roughly tell what is real if we use our brains and think logically. A teaser trailer shown 3-4 or even more years before a game's launch has nothing to do with actual gameplay graphics quality.

Dunno if there's such a saying in English, but in Polish we say: you can't lure an old bear with fake honey. Let's act like old bears, not bear cubs, plz :)
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Guys, come on, let's get back to earth. A guy is posting Avatar cinematics/target renders as what he expects games to look like (btw, that bullshot trailer happened 2 years ago; at that time the strongest PC GPU available was the RTX 3090, which is roughly 2x stronger than the PS5/XSX GPUs, and if counting ray tracing then probably close to 3x).

So basically we have 2 options: Ubisoft, like always, presented a fake trailer/target render/cinematic, aka in layman's terms not an actual gameplay trailer, or the very, very unlikely 2nd option: a few years before the game's launch they already had a working build of the game at a stable 30fps, running on a GPU 2x stronger than the one in current gen consoles, that looked gorgeous af :)

Just a reminder, that's how this trailer looks.
We don't have "running on PS5" there, or even alpha footage, or even in-engine footage, nothing; that should give you enough info about how real it is.

And remember guys, we're talking Ubisoft here; they are famous for showing bullshot trailers a few years before the actual launch of the game. It's not even a downgrade as such, coz to downgrade there has to be a better-looking version that gets downgraded; it's just a full-on cinematic, not the real game, not even in-engine footage.

We all love groundbreaking graphics here, but I assume most of us are veteran gamers who have at least 2 console gens (more likely even 3 or 4) already under our belts; we can roughly tell what is real if we use our brains and think logically. A teaser trailer shown 3-4 or even more years before a game's launch has nothing to do with actual gameplay graphics quality.

Dunno if there's such a saying in English, but in Polish we say: you can't lure an old bear with fake honey. Let's act like old bears, not bear cubs, plz :)

I think everyone here expects that game to be downgraded. We are all aware of Ubisoft’s downgrades.
 

ProtoByte

Weeb Underling
2025 is out of the picture, seeing as how those games would've been revealed 2-3 years in advance like Horizon FW, GoW Ragnarok, GT7, FF16, FF7 Rebirth, Forspoken, Spider-Man and Wolverine. So you're looking at 2026-2028. The worst part is that it's a total of four games: TLOU3, Wolverine, Ghost of Tsushima 2 and whatever SSM is cooking up. The rest are all GaaS games that will look last gen. So 4 games from the cream of the crop for us to look forward to for the remainder of the gen. There is no secret second team. No new IP in development that would release in the next 6 months.
Don't forget Horizon 3. Lol
But hey, people wanted more multiplayer out of PlayStation. Be careful what you wish for I guess.

Just four games, one of which is from Insomniac, who have regressed to embarrassing levels and should have their Sony first-party card revoked.
That's not fair. They've just not advanced that far with SM2. If you want games quicker out of a studio, they're not going to be pushing as far as someone like you or me wants.

Hopefully their multiplayer game gets canned and they learn enough from Wolverine to hunker down on a Spider-Man 3 or some shit for next gen. But the message will have to be clear for Spider-Man 2: "Good, but not enough of an upgrade". Not just in visuals either.
 
Last edited:

winjer

Gold Member
What is the cost of RT reflections on CPU, GPU, and memory/memory bandwidth? And if you're crazy enough to answer a question that I know you couldn't possibly have the answer for, please follow up with another feature that you would implement instead that is equal to or less than the cost of said RT reflections.

On the CPU it depends on how the RT is done. Most games do the BVH building and traversal on the GPU. But games like Spider-Man do it on the CPU.
But then there is the matter of what RT effects are being rendered. If it's reflections, then it is necessary to render not only what is in the player field of view, but also what is in the reflection field of view. This means rendering more of the world, so more draw calls for the CPU to calculate.
But if it's just Global Illumination, and the BVH is done on the GPU, then it will have little impact on the CPU.
If it's just shadows, it might even reduce load on the CPU. Traditional shadows are calculated by placing a virtual camera at the position of the light source and then using the depth buffer from it to sample which pixels are in shadow. Usually this increases geometry work by a lot.
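That two-pass shadow-map idea can be sketched very compactly (a toy 1-D version under simplifying assumptions; real shadow maps are 2-D depth textures with filtering, but the depth-compare logic is the same):

```python
# Toy shadow map: render depth from the light, then test points against it.
# 1-D "scene" to keep it readable; real maps are 2-D depth textures.

def render_depth_from_light(occluders, n_texels, extent):
    """Pass 1: for each texel, store distance to the nearest occluder
    (the depth buffer seen from the light's virtual camera)."""
    depth = [float("inf")] * n_texels
    for x, dist in occluders:                 # (position, depth from light)
        t = int(x / extent * n_texels)
        depth[t] = min(depth[t], dist)
    return depth

def in_shadow(depth_map, x, dist, n_texels, extent, bias=0.01):
    """Pass 2: a point is shadowed if something sits closer to the light
    along the same texel (the bias avoids self-shadow 'acne')."""
    t = int(x / extent * n_texels)
    return dist > depth_map[t] + bias

shadow_map = render_depth_from_light([(2.0, 1.0)], n_texels=8, extent=8.0)
print(in_shadow(shadow_map, 2.0, 5.0, 8, 8.0))  # True: behind the occluder
print(in_shadow(shadow_map, 6.0, 5.0, 8, 8.0))  # False: nothing blocks it
```

Pass 1 is the extra geometry submission winjer mentions: the scene gets drawn once per shadow-casting light just to fill that depth buffer, which is the cost RT shadows can sidestep.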

In terms of memory bandwidth, it has little impact. Most of the RT work is done in the L2 cache on RDNA2 GPUs.
But RT is very dependent on register size, to keep work waves in flight. One of the things RDNA3 did to improve performance was to increase register size per WGP.
Be it RDNA2, RDNA3, Ampere, Turing or Ada Lovelace, shader occupancy drops drastically while using RT or path tracing.
For example, in CP2077 with path tracing, even Ada Lovelace only manages ~40% of instruction throughput, although register allocation is close to 100%.
There is also the problem of conditional branches. GPUs are very bad at this type of thing, and it results in low GPU usage.
 
Last edited:

PeteBull

Member
I think everyone here expects that game to be downgraded. We are all aware of Ubisoft’s downgrades.
If yes, then we gotta be real with ourselves: a cinematic trailer shouldn't be the benchmark of what is expected from current gen consoles.
Did we expect PS1 games to look like the Soul Blade/Edge intro? Of course not, not even close.

Nowadays the difference between CGI and gameplay/in-engine cutscenes isn't as big as it was 25 years ago, but it's still clearly visible.

I'm confident even casuals or non-gamers can clearly see the difference between, for example, the Witcher 3 CGI trailer, the bullshot trailer that pretends to be gameplay (that famous beauty), and the actual real gameplay trailer, which looks visibly worse.

I used Witcher 3 so we can dismiss doubts about the quality of the dev studio, and since it's an independent game there's hopefully no need for console warring either. On top of that, the game came out in May 2015, so there's been plenty of time to reflect on its actual graphics/cinematics/CGI, and it even had a next gen update on current gen consoles, plus the PC version got an even bigger upgrade with all kinds of RT and mods put directly into it, which looks like this.

To quickly sum it up, even the next gen version on PC, maxed, on a machine that currently costs close to 3k USD, still looks a bit worse (maybe not overall, but in some places for sure) than the bullshot trailer from Dec 2013, 1.5 years before the game's launch.
Obviously there's no point comparing the actual game to CGI trailers at all, coz there is still a Colorado-canyon-sized gap between the 2013 CGI and the definitive edition, maxed, launched in the 2nd half of 2022.
 

Hot5pur

Member
This gen has taught me to stop caring about graphics. There are PS4 games that look as good as current gen titles. Problem is many games are going open world and that also makes it difficult to make great graphics.
I don't even know what next gen graphics will look like, nothing I've seen so far has really impressed.
Best way to increase immersion at this point is more fluid animation and realistic reactions to environment and physics. Perhaps we will be wowed at this upcoming Xbox event but I can't see it, there are already games that max out these consoles at 4K30 and they look good but not "next gen".
One area of innovation I've seen is the RE engine. The graphics are not that great for environments, but for character models they are quite impressive, especially in DMC5 and RE games. Really brings the characters to life.
 

SlimySnake

Flashless at the Golden Globes
To quickly sum it up, even the next gen version on PC, maxed, on a machine that currently costs close to 3k USD, still looks a bit worse (maybe not overall, but in some places for sure) than the bullshot trailer from Dec 2013, 1.5 years before the game's launch.
Obviously there's no point comparing the actual game to CGI trailers at all, coz there is still a Colorado-canyon-sized gap between the 2013 CGI and the definitive edition, maxed, launched in the 2nd half of 2022.
Downgrades are downgrades, and they impact even gameplay demos. See Watch Dogs, Division, Anthem, TLOU2, etc. It's just an industry reality. Witcher 3's downgrade was bad, but I'd argue games like RDR2, TLOU2 and Ghost of Tsushima topped that original Witcher 3 reveal trailer. Anthem was light years ahead of anything on consoles in 2017, but HFW looks better. Watch Dogs was downgraded, but Infamous looked better and it came out the same year.

Point is we can assume that these games will get downgraded, and still assume the best studios will eventually get close to those graphics. Massive downgraded Division from its original reveal, but the game still looks amazing to this day.

Lastly, CD Projekt were very open about that Witcher 3 downgrade, admitting that they had to change their entire rendering engine to get the game running on consoles. With Cyberpunk, they did not downgrade the PC version to get that game running on consoles, and while the console versions suffered and ran like complete dogshit, the PC version looked BETTER than the original reveal. Only the number of NPCs was downgraded; everything else was upgraded in the final version.

 

SlimySnake

Flashless at the Golden Globes
This gen has taught me to stop caring about graphics. There are PS4 games that look as good as current gen titles. Problem is many games are going open world and that also makes it difficult to make great graphics.
I don't even know what next gen graphics will look like, nothing I've seen so far has really impressed.
Best way to increase immersion at this point is more fluid animation and realistic reactions to environment and physics. Perhaps we will be wowed at this upcoming Xbox event but I can't see it, there are already games that max out these consoles at 4K30 and they look good but not "next gen".
One area of innovation I've seen is the RE engine. The graphics are not that great for environments, but for character models they are quite impressive, especially in DMC5 and RE games. Really brings the characters to life.
Preach. If devs wanted to stick with last gen graphics, I would've been OK with it if it was due to animations and NPC character rendering like TLOU2, and full-blown physics systems. Sadly we have seen none of that. One of my biggest disappointments with the TLOU remake was that they didn't bother adding the incredible NPC character models in a PS5-only remake built on the same engine. They had that running on the PS4 Pro, and the PS5 is like 3x more powerful. God, that remake is just one giant disappointment.

Spider-Man 2 is doing absolutely nothing new with NPC simulation. It's your standard NPC simulation we have seen since the PS3 days. Hell, Infamous on PS3 had better NPC interactions, where they would get sucked into tornadoes and thrown about with cars, trash and other objects on the streets. Something Sucker Punch inexplicably removed in the PS4 games, presumably due to the poor Jaguar CPUs, but now? What's the excuse now?
 

SlimySnake

Flashless at the Golden Globes
The shading on the right is gorgeous.
Apparently the Idris Elba Cyberpunk DLC is third person since he's the playable character. I think the first-person view actually held back CP's visuals. The game always looked better to me while I was driving around in the car.

People like Alex Battaglia say that the first-person view is more immersive, but I will always point to Witcher 3 vs Cyberpunk and ask which game did a better job selling its atmosphere and visual design. Let's just hope CP has better animations than Witcher 3.
 

GymWolf

Gold Member
Apparently the Idris Elba Cyberpunk DLC is third person since he's the playable character. I think the first-person view actually held back CP's visuals. The game always looked better to me while I was driving around in the car.

People like Alex Battaglia say that the first-person view is more immersive, but I will always point to Witcher 3 vs Cyberpunk and ask which game did a better job selling its atmosphere and visual design. Let's just hope CP has better animations than Witcher 3.
CP is way more immersive; Witcher 3 just has a way better story and characters, so you feel more immersed.
 
Last edited:
If yes, then we gotta be real with ourselves: a cinematic trailer shouldn't be the benchmark of what is expected from current gen consoles.
Did we expect PS1 games to look like the Soul Blade/Edge intro? Of course not, not even close.

Nowadays the difference between CGI and gameplay/in-engine cutscenes isn't as big as it was 25 years ago, but it's still clearly visible.

I'm confident even casuals or non-gamers can clearly see the difference between, for example, the Witcher 3 CGI trailer, the bullshot trailer that pretends to be gameplay (that famous beauty), and the actual real gameplay trailer, which looks visibly worse.

I used Witcher 3 so we can dismiss doubts about the quality of the dev studio, and since it's an independent game there's hopefully no need for console warring either. On top of that, the game came out in May 2015, so there's been plenty of time to reflect on its actual graphics/cinematics/CGI, and it even had a next gen update on current gen consoles, plus the PC version got an even bigger upgrade with all kinds of RT and mods put directly into it, which looks like this.

To quickly sum it up, even the next gen version on PC, maxed, on a machine that currently costs close to 3k USD, still looks a bit worse (maybe not overall, but in some places for sure) than the bullshot trailer from Dec 2013, 1.5 years before the game's launch.
Obviously there's no point comparing the actual game to CGI trailers at all, coz there is still a Colorado-canyon-sized gap between the 2013 CGI and the definitive edition, maxed, launched in the 2nd half of 2022.


Looking at that, I'm reminded of the utter dishonesty of these companies. Notice at the end of the first bullshot "gameplay" trailer, the first words that pop up are "Xbox One/PS4," yet today's highest-end PC, 10 years later, can only maybe kinda sorta match those visuals.
 
In summary, but in less kind words than NXG: pretty underwhelming. Incremental upgrades not possible on PS4, but ultimately chained to a PS4 baseline.

It sucks that they didn't overhaul their engine; having better fluid simulation would've enhanced what they're trying to do with their black suit design - which has fully grown on me.
It's funny how everyone was losing their minds over Ratchet and Clank and Miles Morales, but now a far more ambitious game arrives from Insomniac and everyone is saying it's a PS4 game? It's the same engine; it's just doing more now than it did before.
 
Last edited:

ProtoByte

Weeb Underling
It's funny how everyone was losing their minds over Ratchet and Clank and Miles Morales, but now a far more ambitious game arrives from Insomniac and everyone is saying it's a PS4 game? It's the same engine; it's just doing more now than it did before.
Well, to be fair, Miles Morales was cross gen and will be 3 years old by the time this game comes out, and Ratchet was a smaller project released 2.5 years earlier in the console's life cycle. Expectations here are different, especially when you've got something like Forbidden West to compare this to, itself a cross gen game.

To be clear, I never said Spider-Man 2 is a PS4 game. I said it has a PS4 baseline. Other than being obvious to anyone being honest with what they're seeing, it's the middle ground between "this is so last gen" and "this looks fantastic, so much better than Miles Morales".
 
Last edited:
Well, to be fair, Miles Morales was cross gen and will be 3 years old by the time this game comes out, and Ratchet was a smaller project released 2.5 years earlier in the console's life cycle. Expectations here are different, especially when you've got something like Forbidden West to compare this to, itself a cross gen game.

To be clear, I never said Spider-Man 2 is a PS4 game. I said it has a PS4 baseline. Other than being obvious to anyone being honest with what they're seeing, it's the middle ground between "this is so last gen"
Forbidden West doesn't even look next gen, and in fact uses fewer next gen graphical effects than Spider-Man.
 

GymWolf

Gold Member
It's funny how everyone was losing their minds over Ratchet and Clank and Miles Morales, but now a far more ambitious game arrives from Insomniac and everyone is saying it's a PS4 game? It's the same engine; it's just doing more now than it did before.
Nobody was losing their mind over Morales except people who get excited for everything that has a Sony logo on the cover, c'mon...
 

alloush

Member
In summary, but in less kind words than NXG: pretty underwhelming. Incremental upgrades not possible on PS4, but ultimately chained to a PS4 baseline.

It sucks that they didn't overhaul their engine; having better fluid simulation would've enhanced what they're trying to do with their black suit design - which has fully grown on me.
Perfectly put. I watched some of it again, and whilst there are some parts that looked good, the entirety of the game feels PS4-ish!
 
Last edited:

analog_future

Resident Crybaby
Bold prediction time:


I think there will be multiple titles at the Xbox Games Showcase that blow our socks off, and our optimism towards the future of current gen visuals will be much higher after this Sunday.
 
Bold prediction time:


I think there will be multiple titles at the Xbox Games Showcase that blow our socks off, and our optimism towards the future of current gen visuals will be much higher after this Sunday.

I still have a tinfoil hat theory that Sony are saving a lot of their big cards (including next-gen goodies) for later this year. Just as a way to stay ahead and strangle Xbox's mindshare.
 