
Graphical Fidelity I Expect This Gen

SlimySnake

Flashless at the Golden Globes
And I laugh at the idea that folk around here think that a PS5 Pro will fix their precious 60 fps modes. Not going to happen.
if the Pro has a better CPU and a roughly 60% increase in GPU power like the leaks suggest, not to mention a DLSS-like upscaling solution, it will be a huge improvement to 60 fps modes. you will get way better image quality thanks to AI upscaling alone. The biggest issue is that games are dropping to 720p; a better GPU and CPU will help them get closer to 1080p, at which point AI upscaling will make 1080p look better than the FSR 2.0 1440p solutions we are getting in 30 fps modes.
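As a rough sanity check on that 720p-to-1080p claim, here's a quick sketch. The assumption (mine, not from the leaks) is that GPU cost scales roughly linearly with pixel count, so linear resolution scales with the square root of the GPU uplift:

```python
import math

def scaled_resolution(width, height, gpu_factor):
    """Scale a render resolution by a GPU-throughput factor, assuming
    frame cost is roughly proportional to pixel count."""
    s = math.sqrt(gpu_factor)  # linear dimensions grow with the square root
    return round(width * s), round(height * s)

# A 60% GPU uplift applied to a 720p internal resolution:
print(scaled_resolution(1280, 720, 1.6))  # (1619, 911), much closer to 1080p
```

In practice the uplift won't map this cleanly onto resolution (CPU limits, fixed per-frame costs), so treat it as an upper-bound estimate.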

Beside the odd texture here and there, FF7 Rebirth is what I expected from this generation.
Why base this off of one bad game? Rise of Ronin looks even worse. Would you say Rise of Ronin is what you expected from this generation?

I wouldn't write off the whole gen based on one game. What about Avatar? Alan Wake 2? Starfield? Star Wars? hell, even FF16 looks better than this.

When Hellblade 2 comes out in two months and takes a big giant shit all over Rebirth, the bar will have been raised.
 

FoxMcChief

Gold Member
if the Pro has a better CPU and a roughly 60% increase in GPU power like the leaks suggest, not to mention a DLSS-like upscaling solution, it will be a huge improvement to 60 fps modes. you will get way better image quality thanks to AI upscaling alone. The biggest issue is that games are dropping to 720p; a better GPU and CPU will help them get closer to 1080p, at which point AI upscaling will make 1080p look better than the FSR 2.0 1440p solutions we are getting in 30 fps modes.


Why base this off of one bad game? Rise of Ronin looks even worse. Would you say Rise of Ronin is what you expected from this generation?

I wouldn't write off the whole gen based on one game. What about Avatar? Alan Wake 2? Starfield? Star Wars? hell, even FF16 looks better than this.

When Hellblade 2 comes out in two months and takes a big giant shit all over Rebirth, the bar will have been raised.
I’m saying FF7 was my expectation. Not as a bad thing. I think it looks great at times. Especially with its scale/scope.
 

SlimySnake

Flashless at the Golden Globes
I’m saying FF7 was my expectation. Not as a bad thing. I think it looks great at times. Especially with its scale/scope.
it looks fine but like winjer said they arent even using the latest version of UE4, let alone UE5.

you should expect more.
 

mrqs

Member
How anyone can describe this as anything other than next-gen is beyond me. Easily the best LOD management I've ever seen in a game.

Personally I don't like the final image, it's a bit too sharp and I don't like the art style. It's good looking for sure, mainly these flying over the world segments, but I still felt disappointed.

Also, the game feels too fast and very light; you can almost never truly take it in. But that was a problem with 2018's Spider-Man too.

I know this isn't that popular of an opinion but I really thought it would be more. The story was sub-par also.
 

FoxMcChief

Gold Member
it looks fine but like winjer said they arent even using the latest version of UE4, let alone UE5.

you should expect more.
Eh. Devs haven’t given me a reason to expect more. They aren’t even using the hardware that’s out properly. Why should I expect them to use new engines well, or the new stop-gap hardware well?
 

ChiefDada

Gold Member
DF is working on a lot of embargoed content, maybe/hopefully some of it related to GDC? This thread is on life support right now.

 

CGNoire

Member
its vram limitation and nothing else

you need more memory for texture leaps. they cant upgrade textures, and actually textures get hit because developers are trying to squeeze more varied textures into larger open world settings now. the only games where we see massive texture improvements are games that have PS3 world design (Alan Wake 2, A Plague Tale: Requiem, etc.)

ps1 3 mb total ram
ps2 total 36 mb ram, 12x memory increase = massive leap in graphics
ps3 total 512 mb ram, 14x memory increase = massive leap in graphics
ps4 total 8192 mb ram, 16x memory increase = great leap in graphics
ps5 total 16384 mb ram, puny 2x memory increase = almost no leap in graphics at all

ps4 packed 8 gb of total memory when NVIDIA was selling 3 gb high end GPUs (780 Ti). ps4 had 2.7x more memory than a titan-class GPU of its time. (the 6 gb version of that gpu came later; the first released product was the 3 gb version IIRC)

the titan-class 3090 in 2020 packed 24 gb of memory, which is only 1.5x more memory than the PS5. While the PS4 had much more memory than the titan-class GPU of its time, the PS5 actually has less memory than the titan GPU of its day. if the ps5 had a memory setup like the ps4's, it would've gotten at least 64 GB of memory (24 x 2.7), and 64 gb would've been a decent 8x memory increase over PS4 that would give us insane leaps in graphics. memory is super cheap. PS4 had no right to have 8 gb vram when it did. but it did. and the results were impressive.

and no one should tell me moore's law or something. if the ps4 could have 16x the memory of the ps3, a bigger leap than ps1 to ps2, you cannot explain a mere 2x memory increase. it should've been AT LEAST a 4x increase, landing at 32 GB.

Instead we have Series S. This entire gen will be limited by funny memory budgets.
This. Been saying this since before the PS5 launch. The 2x jump is absolutely pathetic and was always going to be this gen's Achilles heel. But all I got was SSD magic this and SSD magic that.
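The generation-over-generation ratios in the quoted post are easy to check (memory figures as stated in the post, not official spec sheets):

```python
# Total system memory per PlayStation generation, in MB (figures from the post above).
memory_mb = {"PS1": 3, "PS2": 36, "PS3": 512, "PS4": 8192, "PS5": 16384}

gens = list(memory_mb)
for prev, curr in zip(gens, gens[1:]):
    ratio = memory_mb[curr] / memory_mb[prev]
    print(f"{prev} -> {curr}: {ratio:.0f}x")  # 12x, 14x, 16x, then only 2x
```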
 

Represent.

Represent(ative) of bad opinions
DF is working on a lot of embargoed content, maybe/hopefully some of it related to GDC? This thread is on life support right now.

It's gonna be a shit year for gaming graphics.

No shade, but it's the year of the Japanese games, and they simply suck at graphics now, aside from Kojima.

We'll be in heaven next year though with GTA 6, Death Stranding 2, and undoubtedly some other Sony First Party heavy hitters will release.
 

SlimySnake

Flashless at the Golden Globes
This. Been saying this since before the PS5 launch. The 2x jump is absolutely pathetic and was always going to be this gen's Achilles heel. But all I got was SSD magic this and SSD magic that.
As a captain of team 14 tflops before launch, I despised the focus on SSD and IO nonsense because I knew the GPU was the most crucial thing about the next-gen consoles. The problem is the PS5 is already pushing 220-230 watts, 80 watts over the base PS4, so there's not much they could do to get to 14. You could have reduced the clock speeds to get more tflops for the same wattage, but MS did that and it blew up in their face.

Basically there is nothing they could’ve done even if they had targeted 600 dollar price points. No one wants a 300 watt console.

Lastly, the jump from the base PS4 was 5x, and almost 8x after you count the IPC gains from PS4 through Polaris to the RDNA architecture. I think the problem is the devs, not the console. They keep targeting native 4k instead of 1440p like the og UE5 and Matrix demos.
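The "5x, almost 8x" arithmetic works out if you take the public TFLOPS figures and fold in an assumed ~1.4x effective gain per FLOP going from GCN to RDNA 2; that multiplier is an estimate for illustration, not an official number:

```python
ps4_tflops = 1.84   # base PS4 (GCN)
ps5_tflops = 10.28  # PS5 (RDNA 2)
ipc_gain = 1.4      # assumed effective per-FLOP gain, GCN -> RDNA 2 (illustrative)

raw = ps5_tflops / ps4_tflops
effective = raw * ipc_gain
print(f"raw: {raw:.1f}x, effective: {effective:.1f}x")  # raw: 5.6x, effective: 7.8x
```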
 

E-Cat

Member
Let's just say you will be having sex with robots that look like Margot Robbie before game devs get to that level of fluid simulation.
Oh, please. If you have the technological capability to make sex bots that look like Margot Robbie, you will have AI that can just neurally hallucinate the requisite physical simulations. It doesn't matter if it's accurate per particle, it just has to look real. OpenAI's SORA is an extremely early example of this.

 
Last edited:

rofif

Can’t Git Gud
And that’s why we get games like FF7 Rebirth with its messy graphical modes. Neither one is amazing or optimized. Fantastic game, but man does it have some inconsistencies. That’s why I think it will be par for the course this generation.

And I laugh at the idea that folk around here think that a PS5 Pro will fix their precious 60 fps modes. Not going to happen.
While I am not a big fan of modes on consoles, since it's choosing between compromises and/or personal preferences, the 30fps mode in Rebirth is about as good as you can get. The image is very stable and high res. 30 is perfectly executed, as it's the low-lag type. It plays perfectly, with not a single crash. It plays so well at 30 that I got used to it very quickly.

The problem with Rebirth is its texture streaming bugs and the lack of camera motion blur, which would help 30fps even more, but that's ok because there is very good animation motion blur.
 

IDWhite

Member
I thought the superfast SSDs (compared to whatever last gen had, anyway) answered some of this concern? ie you don't need as much cos you can swap stuff in and out so much quicker.
With SSDs and dedicated I/O hardware you can manage memory more efficiently without touching game engine code, but that's not enough when you want high fidelity visuals in complex, big open world games on a machine with 13GB of usable RAM. If you want to take full advantage of the SSD and I/O hardware, then it is necessary to rewrite a huge part of the engine.

Cerny's idea in the design of the PS5 hardware on this subject implied a huge paradigm shift in streaming, memory management, loading... That idea of constantly moving GBs of data in and out of memory as you turn the camera hasn't been accomplished by anyone, and no one is going to do so in the current game design situation.
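To put that "GBs in and out as you turn the camera" idea in numbers, here's a back-of-envelope sketch. The ~5.5 GB/s figure is the PS5's advertised raw throughput; the half-second turn duration is an assumption:

```python
def streamable_mb(throughput_gb_s, turn_seconds):
    """Data (in MB) that can be pulled from the SSD during one camera turn."""
    return throughput_gb_s * 1024 * turn_seconds

# ~5.5 GB/s raw throughput, a snappy half-second 180-degree turn:
print(streamable_mb(5.5, 0.5))  # 2816.0 MB, i.e. roughly 2.8 GB per turn
```

Which is exactly why the idea demanded engine rewrites: the hardware can deliver that much, but the engine has to know precisely what to request and keep almost nothing resident to spare.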
 
Last edited:

Bojji

Member
So basically it was all bollocks.

Devs are super lazy when it comes to paradigm shifting.

It may become true in the future and maybe some PS5 games will actually use it.

Like Blu ray on PS3 was useless for 90% of games but some games really needed that amount of data for all assets (and HQ videos).
 

SlimySnake

Flashless at the Golden Globes
Oh, please. If you have the technological capability to make sex bots that look like Margot Robbie, you will have AI that can just neurally hallucinate the requisite physical simulations. It doesn't matter if it's accurate per particle, it just has to look real. OpenAI's SORA is an extremely early example of this.

lol the games industry is decades behind other industries. They can do Matrix-quality graphics and then give us visual masterpieces like Rebirth and Rise of Ronin.

Even if the rest of the world can get there, the industry with its complete lack of talent and ambition will take more time to get there.

Regardless, it was a joke.
 

SlimySnake

Flashless at the Golden Globes
So basically it was all bollocks.

Devs are super lazy when it comes to paradigm shifting.
Of course it was. The thing is, no one is going to redesign games to fit Cerny's vision. That's why I always thought it was nonsensical to focus so much on the SSD and IO. Most devs make third party games anyway, and even first party studios were never going to change the way they make games just to appease Cerny.

Only Insomniac has used the PS5 SSD and IO, for its portal sections, but even they don't utilize more than 1 GB or 2. It's a complete waste. Some focus on extra CPU cache or machine learning cores would've helped every single game.
 

IDWhite

Member
Of course it was. The thing is, no one is going to redesign games to fit Cerny's vision. That's why I always thought it was nonsensical to focus so much on the SSD and IO. Most devs make third party games anyway, and even first party studios were never going to change the way they make games just to appease Cerny.

Only Insomniac has used the PS5 SSD and IO, for its portal sections, but even they don't utilize more than 1 GB or 2. It's a complete waste. Some focus on extra CPU cache or machine learning cores would've helped every single game.

Cerny's idea is completely logical from a hardware and software engineering perspective, because the biggest limitation in computing is not the power of the processors but the speed and latency with which data moves. Older hard drives and I/O protocols were by far the biggest limitation.

That said, it must also be taken into account that it is impossible to predict how the industry will evolve in the coming years.
 

SlimySnake

Flashless at the Golden Globes
Cerny's idea is completely logical from a hardware and software engineering perspective, because the biggest limitation in computing is not the power of the processors but the speed and latency with which data moves. Older hard drives and I/O protocols were by far the biggest limitation.

That said, it must also be taken into account that it is impossible to predict how the industry will evolve in the coming years.
I disagree. The biggest limitation is GPU horsepower. It will always be the biggest limitation.

Any SSD would've solved the data transfer issue. You can run Ratchet on a 500 MB/s SSD; hell, you can run it on an HDD as long as you have a fast enough CPU.

Besides, Nanite solved the data transfer bottlenecks, just in a different way. I commend Cerny for trying to solve a problem with an elegant solution, but that was a 2016 problem. That is not what's holding back the PS5 in 2024, as we see games run at terrible resolutions with awful image quality because of the lack of a meaningful GPU upgrade. Though like I said, I don't blame Cerny for that one. AMD gave him no option.

Speak for yourself.
I want an $800 400-watt console, but I'm pretty sure there are EU regulations, not to mention the costs related to cooling a 300-watt console alone would be substantial.

It will be interesting to see just how far Sony is willing to push the Pro. If it hits 250-260 watts, then we just might get a 300-watt PS6, as long as the EU is ok with it.
 

shamoomoo

Member
So basically it was all bollocks.

Devs are super lazy when it comes to paradigm shifting.
Yes and no. Not every game dev is going to have the same level of experience, nor is every game going to require the capabilities of the PS5. Cerny wanted to have few bottlenecks between the hardware and how developers want to make their games.
 

GymWolf

Gold Member
Yes and no. Not every game dev is going to have the same level of experience nor is every game going to require the capability of the PS5. Cerny wanted to have few bottlenecks between the hardware and how developers wants to make their games.
I was talking broadly, and it's not like Sony studios have shown much of the SSD sauce other than Spidey having fast traversal or the gimmick portal thing in Ratchet (both of which aren't even related to swapping in high quality textures so we don't have low res assets anymore, like Cerny promised).

Also, in a console where 98% of games are third party, creating tech that only some super talented internal party can use is kind of trash tech if you ask me. The PS3 was trolled hard for being a console that only first parties could max out.
 
I was talking broadly, and it's not like Sony studios have shown much of the SSD sauce other than Spidey having fast traversal or the gimmick portal thing in Ratchet (both of which aren't even related to swapping in high quality textures so we don't have low res assets anymore, like Cerny promised).

Also, in a console where 98% of games are third party, creating tech that only some super talented internal party can use is kind of trash tech if you ask me. The PS3 was trolled hard for being a console that only first parties could max out.
Only difference is the first parties would actually use the Cell and make industry-leading games :/ no first party is really using the SSD IO except Insomniac, arguably :/ it's lame as hell
 

IDWhite

Member
I disagree. The biggest limitation is GPU horsepower. It will always be the biggest limitation.

Any SSD would've solved the data transfer issue. You can run Ratchet on a 500 MB/s SSD; hell, you can run it on an HDD as long as you have a fast enough CPU.

Besides, Nanite solved the data transfer bottlenecks, just in a different way. I commend Cerny for trying to solve a problem with an elegant solution, but that was a 2016 problem. That is not what's holding back the PS5 in 2024, as we see games run at terrible resolutions with awful image quality because of the lack of a meaningful GPU upgrade. Though like I said, I don't blame Cerny for that one. AMD gave him no option.

GPU horsepower is another limitation, but you need data sent to the processors to make use of it, and that comes from memory, no matter if it is primary (RAM, VRAM...) or secondary (HDDs, SSDs...).

You can have a huge GPU running at 100% while wasting half its cycles waiting for the necessary data. So incrementing GPU or CPU capabilities without addressing memory bottlenecks first is a waste of resources.
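That stall argument can be expressed as a toy roofline-style bound: the fraction of a frame's GPU work that actually executes is capped by how much of its data arrives in time. All numbers here are illustrative:

```python
def usable_fraction(data_needed_gb, data_delivered_gb):
    """Fraction of a frame's GPU work that has its data in time; the rest stalls."""
    return min(data_delivered_gb / data_needed_gb, 1.0)

# The frame touches 2 GB of assets, but memory/IO only delivers 1 GB in time:
print(usable_fraction(2.0, 1.0))  # 0.5 -> a GPU "at 100%" doing half the useful work
```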
 

GymWolf

Gold Member
GPU horsepower is another limitation, but you need data sent to the processors to make use of it, and that comes from memory, no matter if it is primary (RAM, VRAM...) or secondary (HDDs, SSDs...).

You can have a huge GPU running at 100% while wasting half its cycles waiting for the necessary data. So incrementing GPU or CPU capabilities without addressing memory bottlenecks first is a waste of resources.
And this is why we needed a higher price point, so we could have both SSD and GPU performance.

The console is unbalanced.
 

SlimySnake

Flashless at the Golden Globes
Only difference is first parties would actually use the Cell and make industry leading games :/ no first party is really using the SSD IO except insomniac arguably :/ it’s lame as hell
Yep. Sony has released several third party exclusives like FF7, FF16, Forspoken, Deathloop, Ghostwire Tokyo, and no dev bothered to do anything with the IO.

Forspoken would've been the perfect contender for this massive data push tech Cerny championed, as it was a game with very fast traversal that needed to constantly swap data in the VRAM. But even back then I was pointing out that while pushing data from the SSD to VRAM is great and all, it is the goddamn GPU that has to render the geometry and the foliage. And guess what? Forspoken kept getting downgraded over and over again with each trailer, and by the time the game came out, it was virtually empty and had none of the high fidelity assets, foliage and graphics from the original reveal. Because the SSD and IO don't render the graphics. The GPU does.

GPU horsepower is another limitation, but you need data sent to the processors to make use of it, and that comes from memory, no matter if it is primary (RAM, VRAM...) or secondary (HDDs, SSDs...).

You can have a huge GPU running at 100% while wasting half its cycles waiting for the necessary data. So incrementing GPU or CPU capabilities without addressing memory bottlenecks first is a waste of resources.
Sure, but like I said, even a SATA SSD at 500 MB/s would fix that. You can run the Matrix demo on a 300 MB/s SSD. Some people were actually running it on an HDD, although that required a powerful CPU.

5.5 GB/s is overkill. He should've spent his R&D budget on Nvidia-quality ray tracing and DLSS tech instead. It's clear AMD is completely fucking clueless, unable to match Nvidia's RT performance from 2018. Literally 6 years behind. You would think Sony's engineers would be able to help AMD close the gap, but Cerny was too busy chasing this ridiculous IO and 5.5 GB/s SSD tech, and MS was too consumed with winning the tflops war.

Let's hope he's learned his lesson and the PS5 Pro has machine learning upscaling and way better ray tracing support so we can play GTA 6, Star Wars Outlaws and Death Stranding with good image quality at 60 fps.
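Putting numbers on the SATA point: if a portal transition swaps roughly 2 GB of assets (the estimate from earlier in the thread), the difference between drive classes is seconds versus a fraction of a second. Throughput figures here are ballpark, not measurements:

```python
def swap_seconds(data_gb, throughput_mb_s):
    """Seconds needed to stream `data_gb` of assets at `throughput_mb_s` MB/s."""
    return data_gb * 1024 / throughput_mb_s

for name, mb_s in [("HDD ~150 MB/s", 150),
                   ("SATA SSD ~500 MB/s", 500),
                   ("PS5 raw ~5500 MB/s", 5500)]:
    print(f"{name}: {swap_seconds(2, mb_s):.2f} s")
```

At ~4 s a SATA drive still needs a masked loading corridor, while the raw PS5 rate hides the swap inside a sub-second transition, which is the part the Ratchet portals actually exploit.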
 

hlm666

Member
If you only have the theory, how the fuck do we know if it is actually doable when nobody has done it?

So for now, it remains bollocks and PR talk.
The funny thing with that whole concept is it completely fails in certain situations. Let's take Spider-Man with RT reflections: if you're only loading assets JIT-style, frames before you need them, all your RT reflections of off-screen geometry are now blank. Someone will say screw RT, dump it for higher quality assets with the concept; that's great, now you have reflections which look worse, and possibly perform worse if you're trying to push higher SSR quality.

Now a gameplay example: if you're only loading assets when they are in the player's view, how do you bounce a grenade or something around a corner when nothing there is loaded yet? So we can now only load textures and higher-LOD objects, because we need something there, and the whole idea of using all the available memory for what's on screen has been hamstrung. And even then we are conveniently ignoring whatever the program and audio are using. It never made sense when you think about how games work.

More memory, like people have been saying, was the only real way to pull off the pie-in-the-sky visuals.
 

IDWhite

Member
The funny thing with that whole concept is it completely fails in certain situations. Let's take Spider-Man with RT reflections: if you're only loading assets JIT-style, frames before you need them, all your RT reflections of off-screen geometry are now blank. Someone will say screw RT, dump it for higher quality assets with the concept; that's great, now you have reflections which look worse, and possibly perform worse if you're trying to push higher SSR quality.

Now a gameplay example: if you're only loading assets when they are in the player's view, how do you bounce a grenade or something around a corner when nothing there is loaded yet? So we can now only load textures and higher-LOD objects, because we need something there, and the whole idea of using all the available memory for what's on screen has been hamstrung. And even then we are conveniently ignoring whatever the program and audio are using. It never made sense when you think about how games work.

More memory, like people have been saying, was the only real way to pull off the pie-in-the-sky visuals.

Custom data structures + frustum culling
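For anyone unfamiliar with the term: frustum culling skips objects whose bounding volumes lie entirely outside the camera's view volume. A minimal sphere-versus-plane sketch (plane normals point into the frustum; the names and the single test plane are illustrative):

```python
def is_visible(center, radius, planes):
    """Sphere-vs-frustum test. Each plane is ((nx, ny, nz), d) with the normal
    pointing inward; the sphere is culled if it lies fully behind any plane."""
    for (nx, ny, nz), d in planes:
        dist = nx * center[0] + ny * center[1] + nz * center[2] + d
        if dist < -radius:
            return False  # completely behind this plane -> cull
    return True

# A single plane facing +z at z = 0, standing in for a camera's near plane:
near_only = [((0.0, 0.0, 1.0), 0.0)]
print(is_visible((0, 0, 5), 1.0, near_only))   # True: in front of the plane
print(is_visible((0, 0, -5), 1.0, near_only))  # False: fully behind, culled
```

Note that this only decides what to render; as the grenade example above points out, gameplay systems still need collision and low-LOD data resident, so the whole memory budget can never go to what's on screen.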
 

SlimySnake

Flashless at the Golden Globes
I would highly recommend everyone watch the physics/destruction discussion of this Ghost Recon 2006 in the DF retro video.

Not only were they able to destroy cars, windows, fences, signs and crates, but the devs also added little details like tires catching fire, trees swaying after explosions even if they don't get destroyed, and of course an entire PhysX pipeline on PC with amazing debris and smoke effects you didn't get on console.

it is insane to see how far we have gone back. This is literally the first big game on the Xbox 360. In the PS4 era, devs had the Jaguar CPU excuse; there's no excuse today. 3 years in, no one has bothered adding any of these little touches to their games. They render them at native 4k, slap on RT and call it a day. Shameful.

Timestamped:
 

rofif

Can’t Git Gud
I would highly recommend everyone watch the physics/destruction discussion of this Ghost Recon 2006 in the DF retro video.

Not only were they able to destroy cars, windows, fences, signs and crates, but the devs also added little details like tires catching fire, trees swaying after explosions even if they don't get destroyed, and of course an entire PhysX pipeline on PC with amazing debris and smoke effects you didn't get on console.

it is insane to see how far we have gone back. This is literally the first big game on the Xbox 360. In the PS4 era, devs had the Jaguar CPU excuse; there's no excuse today. 3 years in, no one has bothered adding any of these little touches to their games. They render them at native 4k, slap on RT and call it a day. Shameful.

Timestamped:

true. It's because of RT.
Not only do reflections look worse than in Max Payne 2 or Half-Life 2, they also cost much more.

2004: perfect reflections, even on dynamic objects out of view.

I don't care if it's planar or double-render or whatever. These reflections are still better than in any RT game today.
Alan Wake 2 had to resort to vaseline mirrors and puddles, or grainy, barely constructed SSR on consoles.

As for physics and other stuff. True too.
Funnily enough, I was very surprised by FF7 Rebirth's NPC count, only to remember Dead Rising from 2006.
edit: btw - I just bought 3 different 8800 cards :D Why not! I had an 8800 GTS back in the day and it was incredible. A time when there were only a few GPUs in the ladder and all were good. The 8800 GTS and GTX were both awesome. The 8800 GT was close to half the price a bit more than a year later and better than the GTX.
My 8800 GTS was the 320MB one from ASUS. The one with Ghost Recon, which I still have the disc for!!! Sadly, I don't have the exact card itself anymore.
2007 photo baby!


edit: also - GRAW on 360 looks way better than the PC version, physics aside
 
Last edited:
I would highly recommend everyone watch the physics/destruction discussion of this Ghost Recon 2006 in the DF retro video.

Not only were they able to destroy cars, windows, fences, signs and crates, but the devs also added little details like tires catching fire, trees swaying after explosions even if they don't get destroyed, and of course an entire PhysX pipeline on PC with amazing debris and smoke effects you didn't get on console.

it is insane to see how far we have gone back. This is literally the first big game on the Xbox 360. In the PS4 era, devs had the Jaguar CPU excuse; there's no excuse today. 3 years in, no one has bothered adding any of these little touches to their games. They render them at native 4k, slap on RT and call it a day. Shameful.

Timestamped:

Uhhh, elephant in the room, Mr. SlimySnake. This game was from a long-bygone era when gaming was a nerd enthusiast hobby, with enthusiast nerds making these games. There was drive and ambition around every corner and under every stone.

Today it is a mass-market, casualized abattoir where normies come to play lowest-effort sports games/COD/Fortnite, and suits instruct their studios to try to capture that same market. Employees generally are tepid on their jobs and leave as soon as possible unless mandated. Much more concerned with somehow getting a black lesbian to kiss on screen than with pushing physics and graphics.

This is speaking broadly, and there are exceptions of course, but not many in my opinion.
 
Last edited:
Tinfoil hat time: do the first parties not really leverage and utilize the SSD IO like they did the Cell CPU back in the day because they know it’ll have to be ported to run on a slow ass SSD or even a high end HDD with a damn 1060 or something to match it?

As opposed to back in the day when that shit was locked on PS3, fully exclusive, no exceptions? I wonder.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Tinfoil hat time: do the first parties not really leverage and utilize the SSD IO like they did the Cell CPU back in the day because they know it’ll have to be ported to run on a slow ass SSD or even a high end HDD with a damn 1060 or something to match it?

As opposed to back in the day when that shit was locked on PS3, fully exclusive, no exceptions? I wonder.
Well, we won't know until first party devs actually start publishing PS5-only games. So far it's only been Insomniac. ND, SSM, GG, Bend and Sucker Punch either haven't released a game or settled for last gen games bottlenecked by HDDs.

Back in the day, the PS3 was not selling and they needed exclusives to keep up with the 360, which was producing one hit after another.
 

GymWolf

Gold Member
I would highly recommend everyone watch the physics/destruction discussion of this Ghost Recon 2006 in the DF retro video.

Not only were the able to destroy cars, windows, fences, signs and crates, but the devs also added little details like tires catching fire, trees swaying after explosions even if they dont get destroyed, and of course an entire physx pipeline on PC with amazing debris and smoke effects you didnt get on console.

it is insane to see how far we have gone back. This is literally the first big game on the Xbox 360. In the PS4 era, devs had the jaguar CPU excuse, no excuse today. 3 years in, no one bothered adding any of these little touches to their games. They rendered them at native 4k, slapped on RT and called it a day. Shameful.

Timestamped:

Doesn't Mercenaries 2 make this one obsolete? And that one was open world...
 

This Unreal Engine 5 game looks too good. Are we finally getting true next-gen quality now? People are really getting their hands on Unreal, and it shows.



Never heard of this tho.

If they actually deliver these graphics in real time on console, I'm very impressed. This is what I think of when I think PS5 visual fidelity.

Also, c'mon lol, these no-name studios punking the biggest names makes me scratch my head at the "budget and manpower and time" arguments. I really do think passion and effort play a bigger factor than a lot of people give credit for.
 

SlimySnake

Flashless at the Golden Globes
Doesn't mercenaries 2 make this one obsolete? and that one was open world...
That's not the point. The point is that even games that didn't revolve around destruction featured way more interaction with the environment than they do today.
 

GymWolf

Gold Member
This Unreal Engine 5 game looks too good. Are we finally getting true next-gen quality now? People are really getting their hands on Unreal, and it shows.



Never heard of this tho.

I mean... there is literally zero gameplay in this trailer...

At best those are real-time cutscenes; at worst it's an in-engine demo like Fable.
 
Last edited:
Well, we won't know until first party devs actually start publishing PS5-only games. So far it's only been Insomniac. ND, SSM, GG, Bend and Sucker Punch either haven't released a game or settled for last gen games bottlenecked by HDDs.

Back in the day, the PS3 was not selling and they needed exclusives to keep up with the 360, which was producing one hit after another.
Decent rebuttal, but I would actually posit that GG and SSM releasing cross-gen garbage is exactly what I'm talking about. They could've pushed the envelope and didn't. In fact they were very open in PS blogs that yes, our games are glorified PS4 games, manage your expectations. That's exactly what I'm talking about, sadly.

I'd imagine that was largely in service of appealing to the widest market possible. The PS4 guys who didn't upgrade. Putting profit over advancement.

And the counter of "oh well there weren't many PS5s out at the ti-" NO. I don't buy it. There were more PS5s out than PS3s and PS4s in the equivalent time frame, and we were getting Driveclub, The Order, Second Son, Killzone, The Last Guardian and much more. The PS3 was on the verge of flopping in its early years and still delivered high-profile exclusives that could never be done on PS2.
 
Last edited:

rofif

Can’t Git Gud
This Unreal Engine 5 game looks too good. Are we finally getting true next-gen quality now? People are really getting their hands on Unreal, and it shows.



Never heard of this tho.

This is CGI, my sir. It looks fully achievable, but only pre-rendered, not with this image quality and density.
If this game looks anything like that when it releases, I will be surprised.
It's like saying this is how Gears of War 2 looks lol


Bit of a pessimistic tangent / side post.
To me, next-gen graphics / future graphics are these cursed AI videos Sora is showing. I bet Nvidia will do this in real time, and we will skip real rendered graphics 10 years from now or less.
Imagine even if you generate this stuff at 720p 30fps... it just looks like real footage.

This shit looks real. If you didn't know you were looking at AI video, you would not be able to tell with some of these vids.
 

GymWolf

Gold Member
That's not the point. The point is that even games that didn't revolve around destruction featured way more interaction with the environment than they do today.
I don't think we needed any more proof of that tbh :lollipop_grinning_sweat:

But tbf, the machines are too weak to have high res / ray tracing / next-gen fidelity / destruction / 60fps all at once. You can usually pick one and a half, 2 when you are lucky, 3 if you are Rockstar (hopefully).
 
Last edited:

SlimySnake

Flashless at the Golden Globes
This is CGI, my sir. It looks fully achievable, but only pre-rendered, not with this image quality and density.
If this game looks anything like that when it releases, I will be surprised.
It's like saying this is how Gears of War 2 looks lol
It's not CG. It looked like realtime in-engine cutscene footage to me. You can tell it's realtime when they start moving the camera around. If it was CG, the movement would be a lot more fluid and cinematic.

The screenshots are in-game.

That said, yes, the game won't look like that on consoles. Play it on your 3080.
 

SlimySnake

Flashless at the Golden Globes
Decent rebuttal, but I would actually posit that GG and SSM releasing cross-gen garbage is exactly what I'm talking about. They could've pushed the envelope and didn't. In fact they were very open in PS blogs that yes, our games are glorified PS4 games, manage your expectations. That's exactly what I'm talking about, sadly.

I'd imagine that was largely in service of appealing to the widest market possible. The PS4 guys who didn't upgrade. Putting profit over advancement.

And the counter of "oh well there weren't many PS5s out at the ti-" NO. I don't buy it. There were more PS5s out than PS3s and PS4s in the equivalent time frame, and we were getting Driveclub, The Order, Second Son, Killzone, The Last Guardian and much more. The PS3 was on the verge of flopping in its early years and still delivered high-profile exclusives that could never be done on PS2.
Nah, I'm with you. It's basically the same thing: PC holding back or PS4 holding back. At the end of the day, the PS5 is being held back.

I mean... there is literally zero gameplay in this trailer...

At best those are real-time cutscenes; at worst it's an in-engine demo like Fable.
Apparently the Fable trailer was taken straight from the in-game level. At least according to the dev on Twitter.

The screenshots are realtime. They fucked up by not showing gameplay. This would've made a bigger splash if they had shown even snippets of gameplay. The game is coming out this year, so we should see gameplay in a few months. The modeling in the car looks really good, even if the woman's character model looks a bit dated.
 

GymWolf

Gold Member
Nah, I'm with you. It's basically the same thing: PC holding back or PS4 holding back. At the end of the day, the PS5 is being held back.

Apparently the Fable trailer was taken straight from the in-game level. At least according to the dev on Twitter.

The screenshots are realtime. They fucked up by not showing gameplay. This would've made a bigger splash if they had shown even snippets of gameplay. The game is coming out this year, so we should see gameplay in a few months. The modeling in the car looks really good, even if the woman's character model looks a bit dated.
I was thinking the same.

There is no way Fable is a 2024 game...
 
Last edited: