
Graphical Fidelity I Expect This Gen

King Dazzar

Member
60fps is a higher framerate, so the games PLAY smoother and FEEL better. Input lag will also be improved. They do not LOOK better. Enough :messenger_tears_of_joy:
Smoother animations and less input lag do not equate to a higher level of visual fidelity. Period, point blank.
Not trying to step into the middle of a spat here, and I fully get what you're saying. But motion is part of fidelity unless you're never going to move or pan the camera. This is why TVs' and monitors' motion processing is so important. For me it's a balance, and both contribute to the presentation. A higher frame rate will indeed make games play and feel smoother, as you say, but it goes beyond latency. You can have resolution on a static frame, but you also have motion resolution, and the higher that is, the better fidelity you will see when panning and moving the camera. That can be achieved by increasing the frame rate, obviously, but it can be enhanced further with TV/monitor processing in conjunction with things like Black Frame Insertion.
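To put a rough number on the motion-resolution point, here is a minimal sketch (my simplification, not the poster's math): on a sample-and-hold display, if your eye tracks a pan, the perceived smear is roughly pan speed times frame hold time, and Black Frame Insertion helps by shortening the hold.

# Sample-and-hold motion smear vs. frame rate and persistence.
def smear_px(pan_speed_px_per_s: float, fps: float, persistence: float = 1.0) -> float:
    """Perceived blur width in pixels; persistence < 1.0 models BFI."""
    return pan_speed_px_per_s * persistence / fps

pan = 1920  # a one-second full-screen pan at 1080p, in pixels/second
print(smear_px(pan, 30))       # 64.0 px of smear at 30fps
print(smear_px(pan, 60))       # 32.0 px at 60fps
print(smear_px(pan, 60, 0.5))  # 16.0 px at 60fps with 50%-persistence BFI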
 

Lethal01

Member
when the tech demos include NPCs walking around, car destruction physics and all the other minute details that come with a game rather than a tech demo, you kind of have to question why the devs don't just use this in their games.

If only there were hundreds of devs talking about why there's a difference between demos like this and full game production. But all the devs I see saying they're different are lazy and unskilled, so I'm forced to ignore them. A shame, really.
 

CGNoire

Member
It looks good, but the shooting looks eh... I don't know.

I feel like shooting at mini xenomorphs is a huge downgrade from fucking up Orks... I know they chose them to replicate the big numbers they did in the World War Z games, but little xenomorphs are just super uninteresting enemies.

Not gonna lie, if the game is just shooting at these things for 10-15 hours, I'm gonna be severely disappointed.
Same. Big fan of SM1, but this has been my biggest concern since the SM2 reveal. I'm like, please tell me there are more enemies than just those. Orks had way more variety.
 
You guys gotta stop demanding 60fps on console games late into the gen. You are limiting graphical and design potential with this absurd obsession over framerate. And yes, it is absurd.
Or -

You guys gotta stop demanding 60fps on console games WHILE ALSO WANTING EVER-IMPROVING VISUAL FIDELITY, when the visually impressive late-last-gen games (like Spider-Man, Horizon, Death Stranding, RDRII and TLOU II) were all 1080p at 30fps.

On PS5 and Series X the standard is now 1440p/60fps (with an extra mode that targets 4K/30fps), not 1080p/30fps. That base alone eats most of the power increase going from PS4 to PS5, never mind that RT is used in some way in many visually impressive games.

This is why I said most of you won't be impressed until PS6/X2, where the base will again be 1440p/60fps with RT, but this time with 40-50 TFLOP GPUs to push fidelity.

This gen is a half-step generation, IMO, due to a combination of factors: a relatively low GPU leap, PS4 Pro / Xbox One X comparisons, COVID, chip shortages (which in turn caused cross-gen to last three years instead of one), and developers working with real-time ray tracing for the first time.
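For scale, a back-of-envelope sketch of that claim (the numbers are mine, not the poster's: nominal mode resolutions plus the commonly cited PS4/PS5 TFLOP figures, ignoring architectural differences):

# How much of the PS4-to-PS5 jump the new 1440p/60 baseline consumes.
last_gen_pixels = 1920 * 1080 * 30   # 1080p30, pixels per second
this_gen_pixels = 2560 * 1440 * 60   # 1440p60, pixels per second

pixel_ratio = this_gen_pixels / last_gen_pixels  # ~3.6x pixel throughput
compute_ratio = 10.28 / 1.84                     # ~5.6x raw TFLOPs (PS5/PS4)

print(f"pixel throughput: {pixel_ratio:.2f}x")
print(f"raw compute:      {compute_ratio:.2f}x")
# Most of the compute gain goes to resolution and frame rate, leaving a
# comparatively thin margin for per-pixel fidelity (before RT costs).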
 

SlimySnake

Flashless at the Golden Globes
To me it just doesn't have that "WOW" factor to be the best looking game. It's definitely a generational jump, especially with the number of enemies and stuff on screen. The cutscenes are ridiculous, almost CGI-level stuff.
That's fair. I think the pre-downgrade original footage had that insane WOW factor, especially that final shot of them flying through Pandora. I actually asked the technical director on Twitter if that particular area is still in the game. No reply yet, so clearly a downgrade.

I just draw the line at people refusing to acknowledge the clear leap in visuals this game has over last-gen masterpieces like RDR2 and HFW.

Being the graphics whore that I am (jeez, that sounds weird), I opt for game mode off most of the time. The only time I enable that shit is when I'm playing sports games or, on the rare occasion, online.

Having said that, wasn't there some kind of alliance formed by the big tech companies around offering high-level graphics without the need to enable game mode? I'm talking about ALLM.
I forgot to reply to your other post about what settings affect the picture quality the most in game mode compared to vivid or even standard.

First things first, turn off HGIG. I don't care what people say, you don't want accurate HDR, you want BRIGHT HDR. Especially on OLEDs like the LG CX, where peak brightness is only 650 nits instead of the 1,000 nits of the infamous KS8000 we all bought when the mid-gen consoles came out. HGIG off, Dynamic Contrast set to High. That will increase the brightness of your TV.

Move the Color slider from 50 to 55. 60 if you can handle it. Vivid settings are typically set to 70, which I find too much, but if you don't mind eye-popping color then go for it.

Keep your color tone at Neutral or Cool. Everyone recommends Warm for games and movies, but while that makes sense for movies, games like Horizon and Ratchet are colorful, and muting their colors with a Mexican filter does them a disservice. Vivid modes on TVs use Cool for a reason.

Keep all the Motion Plus settings off. However, I did play Zelda with some of them enabled, based on GamingTech's recommendations, and it felt way better than a 30fps game. There is some minor ghosting, but in a game where you aren't moving the camera around too fast, you won't notice it. Those motion settings are the main ones that introduce input lag, so keep them off; they don't impact the picture quality.
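Condensed into a checklist, as a sketch (the setting names below approximate LG's menu labels; they're my shorthand, not an official API):

# Game-mode picture tweaks from the post above, as a simple checklist.
GAME_MODE_TWEAKS = {
    "HGIG": "Off",                     # trades HDR accuracy for brightness
    "Dynamic Contrast": "High",        # boosts perceived brightness
    "Color": 55,                       # up from the default 50; 60 if tolerable
    "Color Tone": "Neutral",           # or "Cool"; the post argues against Warm
    "Motion Plus / TruMotion": "Off",  # the main source of added input lag
}

for setting, value in GAME_MODE_TWEAKS.items():
    print(f"{setting}: {value}")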
 

Hunnybun

Member
That's fair. I think the pre-downgrade original footage had that insane WOW factor, especially that final shot of them flying through Pandora. I actually asked the technical director on Twitter if that particular area is still in the game. No reply yet, so clearly a downgrade.

I just draw the line at people refusing to acknowledge the clear leap in visuals this game has over last-gen masterpieces like RDR2 and HFW.


I forgot to reply to your other post about what settings affect the picture quality the most in game mode compared to vivid or even standard.

First things first, turn off HGIG. I don't care what people say, you don't want accurate HDR, you want BRIGHT HDR. Especially on OLEDs like the LG CX, where peak brightness is only 650 nits instead of the 1,000 nits of the infamous KS8000 we all bought when the mid-gen consoles came out. HGIG off, Dynamic Contrast set to High. That will increase the brightness of your TV.

Move the Color slider from 50 to 55. 60 if you can handle it. Vivid settings are typically set to 70, which I find too much, but if you don't mind eye-popping color then go for it.

Keep your color tone at Neutral or Cool. Everyone recommends Warm for games and movies, but while that makes sense for movies, games like Horizon and Ratchet are colorful, and muting their colors with a Mexican filter does them a disservice. Vivid modes on TVs use Cool for a reason.

Keep all the Motion Plus settings off. However, I did play Zelda with some of them enabled, based on GamingTech's recommendations, and it felt way better than a 30fps game. There is some minor ghosting, but in a game where you aren't moving the camera around too fast, you won't notice it. Those motion settings are the main ones that introduce input lag, so keep them off; they don't impact the picture quality.

I actually rewatched the PS5 footage of Avatar last night just to check I wasn't misremembering. I wasn't. I think it looks quite rough: some of it very good, some pretty mediocre. Being as honest as I can, IMO Ratchet and HBS both generally look significantly better.

It's all very well insisting that Ratchet and Horizon are last gen or cross gen or whatever, but have you actually gone back and played 2016 Ratchet (yes, at upscaled 4K and 60fps, so purely comparing the other aspects)?

I have. It looks much, MUCH worse than Rift Apart. It's a BIG fucking difference. Obviously it's hard to quantify whether it's as big a leap as Ratchet PS3 to 2016, but it's of that order. For want of a better term, it's a generational leap, as defined by previous generational leaps.

The same goes for Horizon ZD. Go back and play it, again at high res and 60fps. It looks very rough compared to Burning Shores. A generational leap? That's less clear, IMO. But it's another big fucking difference.

Yes, both games would've looked better still in their 30fps modes if roughly half the pixels had been sacrificed, but they don't remotely look like last-gen games.
 

SlimySnake

Flashless at the Golden Globes
Do the regulars in this thread ever wonder if they’re just intentionally driving themselves crazy, torturing themselves?

Do you ever think you’ll be happy? Do you ever worry this obsession interferes with your ability to love games?
Regulars in this thread have consistently praised recently released games like Star Wars Jedi: Survivor and FF16, and have been plenty happy with most games revealed at E3 and Gamescom. Star Wars Outlaws, Fable, Avatar, Starfield, Space Marine and Alan Wake 2 have all received a lot of praise in this thread.

We are doing fine.

Or -

You guys gotta stop demanding 60fps on console games WHILE ALSO WANTING EVER-IMPROVING VISUAL FIDELITY, when the visually impressive late-last-gen games (like Spider-Man, Horizon, Death Stranding, RDRII and TLOU II) were all 1080p at 30fps.

On PS5 and Series X the standard is now 1440p/60fps (with an extra mode that targets 4K/30fps), not 1080p/30fps. That base alone eats most of the power increase going from PS4 to PS5, never mind that RT is used in some way in many visually impressive games.

This is why I said most of you won't be impressed until PS6/X2, where the base will again be 1440p/60fps with RT, but this time with 40-50 TFLOP GPUs to push fidelity.

This gen is a half-step generation, IMO, due to a combination of factors: a relatively low GPU leap, PS4 Pro / Xbox One X comparisons, COVID, chip shortages (which in turn caused cross-gen to last three years instead of one), and developers working with real-time ray tracing for the first time.
I think as the gen goes on and top-tier developers finally free themselves from the shackles of last gen, we will see the massive leap we have come to expect. In another thread just yesterday, I listed all the devs who had released games in the first three years of last gen; all but one or two have been MIA or have been forced to release cross-gen games so far this gen.

I think the reason people are so underwhelmed is that the devs who typically set those high bars have phoned it in this gen, either tying themselves to last-gen hardware like GG did with Horizon, or going completely MIA like Crytek.

So far, in the first three years of this gen, we have zero or only cross-gen output from the studios that delivered massive generational leaps last gen. This was their output last gen:

- KZSF - GG
- Ryse - Crytek
- Infamous - Sucker Punch
- DriveClub - Evolution Studios
- AC Unity - Ubisoft Montreal
- The Order - Ready At Dawn
- Battlefield 1 - DICE
- Uncharted 4 - ND
- Witcher 3 - CD Projekt
- Rise of the Tomb Raider - Crystal Dynamics
- Batman Arkham Knight - Rocksteady
- Doom - id Software
- Quantum Break - Remedy
- Division - Massive

When 99.99% of these top-tier studios haven't shown up this gen or have limited themselves to last gen, are we really surprised that we are not seeing a big leap? When ND, Crystal Dynamics, CD Projekt, id Software, Sucker Punch and the other studios finally show up, people will see what these consoles can truly do.

Thankfully, Remedy and the Division devs are about to release a couple of incredible-looking next-gen games that will set the bar when they come out.

alan-wake-2-alan-wake.gif


giphy.gif




This thread was made before E3. I've seen enough at E3 and Gamescom to say that the leap is going to blow people away when these games finally arrive.

People should see a massive generational leap in the gifs below. Spoiler-tagging them so the gifs don't slow the page down for everyone else.

9f750f1633c641498f1d3d9381fa77e13a6d92db.gifv

IpEyFxe.gif

JC7qqJt.gif

5eMkKFU.gif


cyvQMBX.gif
3N3Fy42.jpg

FyW_icfX0AEOYNM


FyW_ibyX0AEPvBD
 

alloush

Member
I forgot to reply to your other post about what settings affect the picture quality the most in game mode compared to vivid or even standard.

First things first, turn off HGIG. I don't care what people say, you don't want accurate HDR, you want BRIGHT HDR. Especially on OLEDs like the LG CX, where peak brightness is only 650 nits instead of the 1,000 nits of the infamous KS8000 we all bought when the mid-gen consoles came out. HGIG off, Dynamic Contrast set to High. That will increase the brightness of your TV.

Move the Color slider from 50 to 55. 60 if you can handle it. Vivid settings are typically set to 70, which I find too much, but if you don't mind eye-popping color then go for it.

Keep your color tone at Neutral or Cool. Everyone recommends Warm for games and movies, but while that makes sense for movies, games like Horizon and Ratchet are colorful, and muting their colors with a Mexican filter does them a disservice. Vivid modes on TVs use Cool for a reason.

Keep all the Motion Plus settings off. However, I did play Zelda with some of them enabled, based on GamingTech's recommendations, and it felt way better than a 30fps game. There is some minor ghosting, but in a game where you aren't moving the camera around too fast, you won't notice it. Those motion settings are the main ones that introduce input lag, so keep them off; they don't impact the picture quality.
Amazing, thanks a lot man. I've been meaning to ask this question for ages but never got the chance to.

I really hate game mode; it results in a huge degradation of the graphics, and I disable it 90% of the time. The picture becomes too dim, and you see haloing and shit around the character, or any moving object for that matter, as everything related to motion settings is turned off.

I will try your settings; hopefully they'll give me the best of both worlds I've been looking for so dearly.
 
Last edited:

GymWolf

Gold Member
Yes. You missed 3 days of downplaying, gifs, and downgrade talk.
YT compression is too high to judge finer details; sometimes it looks better than Horizon, sometimes worse.

This thing is gonna succeed if it manages to be more consistent than Horizon; highly cherry-picked trailers won't show whether half of the game looks meh.

The moment I start to see too many meh character models, creatures or overall bad textures here and there, the game falls into the cross-gen category like Horizon.

Sorry not sorry.
 
Last edited:

Lethal01

Member
I forgot to reply to your other post about what settings affect the picture quality the most in game mode compared to vivid or even standard.

First things first, turn off HGIG. I don't care what people say, you don't want accurate HDR, you want BRIGHT HDR. Especially on OLEDs like the LG CX, where peak brightness is only 650 nits instead of the 1,000 nits of the infamous KS8000 we all bought when the mid-gen consoles came out. HGIG off, Dynamic Contrast set to High. That will increase the brightness of your TV.

Move the Color slider from 50 to 55. 60 if you can handle it. Vivid settings are typically set to 70, which I find too much, but if you don't mind eye-popping color then go for it.

Keep your color tone at Neutral or Cool. Everyone recommends Warm for games and movies, but while that makes sense for movies, games like Horizon and Ratchet are colorful, and muting their colors with a Mexican filter does them a disservice. Vivid modes on TVs use Cool for a reason.

Keep all the Motion Plus settings off. However, I did play Zelda with some of them enabled, based on GamingTech's recommendations, and it felt way better than a 30fps game. There is some minor ghosting, but in a game where you aren't moving the camera around too fast, you won't notice it. Those motion settings are the main ones that introduce input lag, so keep them off; they don't impact the picture quality.

Accurate HDR is way better than bright HDR whenever you aren't playing in a bright room or haven't got your TV outside. The less auto-shit the better. I just go Filmmaker Mode and sometimes change the depth from 55 to 60 if I'm really feeling spicy. I'm not here to make every game look like a cartoon; I've got stylized games for that. If a game is meant to be colorful, it will look colorful at normal settings, no need to boost things. It's like lowering your black level to 10 for playing horror games.
 
Last edited:
That's fair. I think the pre-downgrade original footage had that insane WOW factor, especially that final shot of them flying through Pandora. I actually asked the technical director on Twitter if that particular area is still in the game. No reply yet, so clearly a downgrade.

I just draw the line at people refusing to acknowledge the clear leap in visuals this game has over last-gen masterpieces like RDR2 and HFW.


I forgot to reply to your other post about what settings affect the picture quality the most in game mode compared to vivid or even standard.

First things first, turn off HGIG. I don't care what people say, you don't want accurate HDR, you want BRIGHT HDR. Especially on OLEDs like the LG CX, where peak brightness is only 650 nits instead of the 1,000 nits of the infamous KS8000 we all bought when the mid-gen consoles came out. HGIG off, Dynamic Contrast set to High. That will increase the brightness of your TV.

Move the Color slider from 50 to 55. 60 if you can handle it. Vivid settings are typically set to 70, which I find too much, but if you don't mind eye-popping color then go for it.

Keep your color tone at Neutral or Cool. Everyone recommends Warm for games and movies, but while that makes sense for movies, games like Horizon and Ratchet are colorful, and muting their colors with a Mexican filter does them a disservice. Vivid modes on TVs use Cool for a reason.

Keep all the Motion Plus settings off. However, I did play Zelda with some of them enabled, based on GamingTech's recommendations, and it felt way better than a 30fps game. There is some minor ghosting, but in a game where you aren't moving the camera around too fast, you won't notice it. Those motion settings are the main ones that introduce input lag, so keep them off; they don't impact the picture quality.
There are calibration standards for all settings to preserve creator intent, so I would never recommend Vivid or Cool settings at all. Neutral maybe, if Warm is inaccurate, but Warm is the way to go. I agree with turning HGIG off: it's accurate but too dark, and you can get similar accuracy by calibrating the HDR option in the PS5 menu or devices menu.
 
Last edited:

Montauk

Member
Are you posting footage of an unreleased, highly anticipated game that has been leaked, without spoiler tags, in a non-relevant thread?

Which doesn't look all that visually impressive, to boot? Really?
 
Last edited:

DeaDPo0L84

Member
I also play with motion blur, chromatic aberration and film grain all maxed out. And on my OLED it's either always Vivid mode or Filmmaker Mode (both calibrated to my own tastes, based on the game). I am a frame warrior's worst fucking nightmare.
I've never seen someone's tag make so much sense.
 

Musilla

Member
Are you posting footage of an unreleased, highly anticipated game that has been leaked, without spoiler tags, in a non-relevant thread?

Which doesn't look all that visually impressive, to boot? Really?
Nothing relevant is shown, but I put it in spoilers. Sorry.
 

Represent.

Represent(ative) of bad opinions
It's crazy that this is from the World War Z devs. Not only are they pushing massive numbers of enemies, but they were also able to push insane detail in the world itself that is clearly a massive leap over last-gen games.

It just might be the best looking game of the year.
Yeah, it's stunning. It's looking like the type of game I didn't think anyone would actually make in 2023. I'm pleasantly surprised. We need a release date.

I've never seen someone's tag make so much sense.
Why do you turn off features the devs leave on by default? It's how the game is supposed to be seen and played. YOU are the weird one.
 
While I think you are nuts for playing in Vivid mode, it always bugs me how flat and bland games look in game mode. Like, wtf was the point of me spending $2,000 on a top-of-the-line TV when half of its features are turned off with game mode on?
You’ll get used to it. Those modes are actually more accurate.
 

SlimySnake

Flashless at the Golden Globes
YT compression is too high to judge finer details; sometimes it looks better than Horizon, sometimes worse.

This thing is gonna succeed if it manages to be more consistent than Horizon; highly cherry-picked trailers won't show whether half of the game looks meh.

The moment I start to see too many meh character models, creatures or overall bad textures here and there, the game falls into the cross-gen category like Horizon.

Sorry not sorry.
Eh. There is a lot more going on under the hood than just fancy graphics. I have posted the next-gen features they outlined two years ago several times in this thread. They could downgrade the game to oblivion, but those gameplay features will remain in the game.

Horizon looks pretty 75% of the time. 80% even. But its flaws are due to its cross-gen nature, and they will be resolved when Horizon 3 comes out. I can promise you it will have better draw distance, character models, and lighting than Avatar, simply because they'll have fixed the LOD pop-in and lighting issues with the extra power of the PS5, something Avatar has already done. When Avatar looks bad, it's because of art design choices, not because the engine couldn't keep up. At least in the vast majority of its underwhelming areas.

What Avatar needs to do next is focus on better character rendering. Even in the bullshot in-engine trailer they released two years ago, the character models stuck out like a sore thumb, so it makes sense that after the downgrade the character models look even worse. This is where they probably should've learned from the insane tech GG apparently used for their character models in cutscenes and dialogue scenes. According to DF, it's a procedural tool that animates and lights each dialogue scene. Since it only kicks in during cutscenes, they probably have a bigger GPU budget to throw at the character models. I hope Massive learns from this for Avatar 2. After all, this is technically their first current-gen title; they will only get better with their second entry, just like GG did with HFW. I hope they go back to that first trailer and bring back that insane photorealistic lighting in the sequel, instead of the cartoonish look they've settled on.

Maybe Massive can talk to GG and at least get the character models looking like this for the sequel.

260d9864ef5d39106c8974d4fbd3dc066fabf6e7.gif
1ktb9qwicw8a1.gif
 

SlimySnake

Flashless at the Golden Globes

No need to spoiler this.

Not the prettiest of gifs, but the lighting looks good: decent light bounce and volumetrics filling up the trees. Tree asset quality does not look good.

Then again, the forest areas never looked good even in last year's direct. The game looks best in barren areas and indoor levels.
 

Lethal01

Member
Eh. There is a lot more going on under the hood than just fancy graphics. I have posted the next-gen features they outlined two years ago several times in this thread. They could downgrade the game to oblivion, but those gameplay features will remain in the game.

Horizon looks pretty 75% of the time. 80% even. But its flaws are due to its cross-gen nature, and they will be resolved when Horizon 3 comes out. I can promise you it will have better draw distance, character models, and lighting than Avatar, simply because they'll have fixed the LOD pop-in and lighting issues with the extra power of the PS5, something Avatar has already done. When Avatar looks bad, it's because of art design choices, not because the engine couldn't keep up. At least in the vast majority of its underwhelming areas.

What Avatar needs to do next is focus on better character rendering. Even in the bullshot in-engine trailer they released two years ago, the character models stuck out like a sore thumb, so it makes sense that after the downgrade the character models look even worse. This is where they probably should've learned from the insane tech GG apparently used for their character models in cutscenes and dialogue scenes. According to DF, it's a procedural tool that animates and lights each dialogue scene. Since it only kicks in during cutscenes, they probably have a bigger GPU budget to throw at the character models. I hope Massive learns from this for Avatar 2. After all, this is technically their first current-gen title; they will only get better with their second entry, just like GG did with HFW. I hope they go back to that first trailer and bring back that insane photorealistic lighting in the sequel, instead of the cartoonish look they've settled on.

Maybe Massive can talk to GG and at least get the character models looking like this for the sequel.

260d9864ef5d39106c8974d4fbd3dc066fabf6e7.gif
1ktb9qwicw8a1.gif

Sure, if the sequel is on PS7.
 

SlimySnake

Flashless at the Golden Globes
Sure, if the sequel is on PS7.
If GG can do this while being held back by the PS4, they can get very close to Avatar's character models on the PS5. Maybe not in terms of facial animation and mocap, but the actual models? Should be doable in cutscenes. The real question is: can Massive? Outlaws has better character models than Avatar, so they've already improved their tech, but can they do a complete overhaul like GG did with HFW?

FL_ct3eWQAsD0w3


horizonforbiddenwest_6wk6y.png
 

Lethal01

Member
If GG can do this while being held back by the PS4, they can get very close to Avatar's character models on the PS5. Maybe not in terms of facial animation and mocap, but the actual models? Should be doable in cutscenes. The real question is: can Massive? Outlaws has better character models than Avatar, so they've already improved their tech, but can they do a complete overhaul like GG did with HFW?

FL_ct3eWQAsD0w3


horizonforbiddenwest_6wk6y.png




4k-avatar2-movie-screencaps.com-64.jpg
4k-avatar2-movie-screencaps.com-553.jpg
4k-avatar2-movie-screencaps.com-10622.jpg

4k-avatar2-movie-screencaps.com-1354.jpg
4k-avatar2-movie-screencaps.com-1270.jpg
4k-avatar2-movie-screencaps.com-981.jpg
4k-avatar2-movie-screencaps.com-13323.jpg


You're right, PS7 Pro
 

SimTourist

Member
If GG can do this while being held back by the PS4, they can get very close to Avatar's character models on the PS5. Maybe not in terms of facial animation and mocap, but the actual models? Should be doable in cutscenes. The real question is: can Massive? Outlaws has better character models than Avatar, so they've already improved their tech, but can they do a complete overhaul like GG did with HFW?

FL_ct3eWQAsD0w3


horizonforbiddenwest_6wk6y.png
Bro, you're completely insane if you think those Avatar models can be rendered in realtime in any capacity. That's the absolute top of the line in CGI, made with effectively unlimited time and budget; most CGI studios struggle to even approach the first movie, let alone the sequel, and you're expecting games to match it? Those models have billions of polygons each; game models are 150k-200k at most. It's like the guy who expected Spider-Man 2 to look like the recent Spider-Man movie CGI, when the game isn't even close to Spider-Man 1's CGI from 2002.
 
Last edited:

SimTourist

Member
I think people have a core misunderstanding of just how far apart realtime rendering and offline rendering are in terms of the processing power required.
Say you take a PS5: in one instance it has to render 30 frames per second, and in another it has an hour to render one single frame. 30 × 60 × 60 = 108,000, so in one hour it produces 108,000 frames versus just one. That's 108,000 times more compute available per frame. This is not a gap that can be closed by normal hardware advancements, especially considering that advancements have slowed to a crawl these days. The time advantage of offline rendering is massive. And this is before considering that CGI render farms are far more powerful than a cheap home console to begin with, and that a single CGI frame usually takes many hours, if not days, to render; the gap becomes astronomical.
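Spelled out as a minimal sketch (same assumptions as the post: 30fps realtime versus one hour per offline frame, on identical hardware):

# Per-frame compute budget: realtime vs. offline rendering.
REALTIME_FPS = 30                    # console target frame rate
OFFLINE_SECONDS_PER_FRAME = 3_600    # one hour per frame, as in the post

frames_per_hour_realtime = REALTIME_FPS * 3_600              # 108,000
frames_per_hour_offline = 3_600 / OFFLINE_SECONDS_PER_FRAME  # 1

ratio = frames_per_hour_realtime / frames_per_hour_offline
print(f"offline rendering's per-frame compute advantage: {ratio:,.0f}x")
# -> 108,000x, before even counting render-farm hardware vs. a console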
 
Last edited:

Lethal01

Member
I think people have a core misunderstanding just how far apart realtime rendering and offline rendering are in terms of processing power required.
Say you take a PS5, in one instance it has to render 30 frames per second and in another it has an hour to render one single frame. 30 x 60 x 60 = 108000, so in one hour it produces 108000 frames versus just one. That's 108 thousand times more power available per frame. This is not a gap that can be closed by normal hardware advancements, especially considering that advancements have slowed down to a crawl these days. The advantage of time to render in CGI is massive. This is not considering that CGI render farms are far more powerful than a cheap home console to begin with and it usually takes many hours if not days to render a frame in CGI, the gap becomes astronomical.

Nah, I'm eyeballing it right now; the difference between PS5 visuals and Avatar is maybe 2x the flops, 4x max.
 
Nah, I'm eyeballing it right now; the difference between PS5 visuals and Avatar is maybe 2x the flops, 4x max.
Not even in the same stratosphere, sadly. Who knows what texture resolution Avatar 2 is using, probably 16K+. That alone is not feasible for a game that is at most 200 GB of data. I bet these movies have terabytes of texture data; it will never be feasible. The next holdup is image quality: everything is supersampled, probably many times above native 8K, so there is simply no aliasing, blur, or artifacting in the moving image anywhere.

I really do hope that over the next decade somebody develops some insane exotic compression tech to make 1 TB games a reality; it's the only way.
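A quick sanity check on the texture math (a sketch under my own assumptions: one square 16K texture at 8-bit RGBA, a full mip chain, and typical 4:1 block compression; none of these figures are from the post):

# Back-of-envelope storage cost of a single 16K texture.
side = 16_384
bytes_uncompressed = side * side * 4          # 4 bytes per texel (RGBA8)
bytes_with_mips = bytes_uncompressed * 4 / 3  # full mip chain adds ~33%
bytes_compressed = bytes_with_mips / 4        # assume 4:1 block compression

print(f"uncompressed: {bytes_uncompressed / 2**30:.2f} GiB")  # 1.00 GiB
print(f"with mips:    {bytes_with_mips / 2**30:.2f} GiB")     # 1.33 GiB
print(f"compressed:   {bytes_compressed / 2**30:.2f} GiB")    # 0.33 GiB
# Even compressed, a few hundred such textures would blow past a 200 GB game.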
 

Lethal01

Member
Not even in the same stratosphere, sadly. Who knows what texture resolution avatar 2 is using, probably 16k+. That alone is not feasible for a game that is at most 200gb of data at max. I bet these movies have terabytes of texture data, it will never be feasible. The next holdup is image quality, everything is supersampled probably many many times above native 8k. There is simply no aliasing/blurry/artifact moving image in sight.

So you're saying about 10x the teraflops or so? I think we could probably get there if Sony releases a Pro model next year.
 

SlimySnake

Flashless at the Golden Globes
Bro, you're completely insane if you think those Avatar models can be rendered in realtime in any capacity. That's the absolute top of the line in CGI, made with effectively unlimited time and budget; most CGI studios struggle to even approach the first movie, let alone the sequel, and you're expecting games to match it? Those models have billions of polygons each; game models are 150k-200k at most. It's like the guy who expected Spider-Man 2 to look like the recent Spider-Man movie CGI, when the game isn't even close to Spider-Man 1's CGI from 2002.
This is what realtime rendering can do on a PS5.

lion_cub_tussle_720.gif

8ba82e817e6ee5d2b9658a3b624a20137fcd3c68.gifv


6f3d38f4ac871587ab8312d5c4d0866e0ceacb36.gif


3N3Fy42.jpg


FyW_ibyX0AEPvBD



In realtime cutscenes, we can easily get very close to that when it comes to character models. Actual facial animation at that CG fidelity is not possible, and likely never will be, but realtime cutscenes have access to way better character models, lighting, hero lighting, and other visual effects, because cutscenes don't need to run game logic and the GPU is mostly rendering characters up close. That's why realtime cutscenes looked like this on a 1.8 TFLOP PS4:

the-last-of-us2-ellie-and-dina.gif


Judging this gen by character models from Massive, who was never great at character models in the Division games, is a mistake. Their expertise lies in stunning environments and level of detail. ND, GG, KojiPro, Ninja Theory and other narrative-driven studios like Quantic Dream are who we should wait for before saying what is and isn't possible.

Can Massive get there this gen? Doubtful, but GG wasn't great either until they unlocked the secret sauce with Horizon 2. Anything is possible, but the tech is definitely there.
 

Flabagast

Member
This is what realtime rendering can do on a PS5.

lion_cub_tussle_720.gif

8ba82e817e6ee5d2b9658a3b624a20137fcd3c68.gifv


6f3d38f4ac871587ab8312d5c4d0866e0ceacb36.gif


3N3Fy42.jpg


FyW_ibyX0AEPvBD



In realtime cutscenes, we can easily get very close to that when it comes to character models. Actual facial animation at that CG fidelity is not possible, and likely never will be, but realtime cutscenes have access to way better character models, lighting, hero lighting, and other visual effects, because cutscenes don't need to run game logic and the GPU is mostly rendering characters up close. That's why realtime cutscenes looked like this on a 1.8 TFLOP PS4:

the-last-of-us2-ellie-and-dina.gif


Judging this gen by character models from Massive, who was never great at character models in the Division games, is a mistake. Their expertise lies in stunning environments and level of detail. ND, GG, KojiPro, Ninja Theory and other narrative-driven studios like Quantic Dream are who we should wait for before saying what is and isn't possible.

Can Massive get there this gen? Doubtful, but GG wasn't great either until they unlocked the secret sauce with Horizon 2. Anything is possible, but the tech is definitely there.
Man, TLOU2 still looks absolutely insane.

The facial animation is the best in the business, as are the hand movements.
 
Accurate HDR is way better than bright HDR whenever you aren't playing in a bright room or haven't got your TV outside. The less auto-shit the better. I just go Filmmaker Mode and sometimes change the depth from 55 to 60 if I'm really feeling spicy. I'm not here to make every game look like a cartoon; I've got stylized games for that. If a game is meant to be colorful, it will look colorful at normal settings, no need to boost things. It's like lowering your black level to 10 for playing horror games.
I love a very vivid experience on my OLED. I have Dynamic Contrast set to High, and number-wise I have the backlight at max with colour/contrast/sharpness set to maximum. It makes every single game (realistic or stylised) look absolutely phenomenal to me personally, but it especially makes Switch games shine.

SlimySnake

I played Tears of the Kingdom with the above settings AND all the motion options turned on. It felt very close to a 60fps game, outside of the slight latency added to the controls, and since it's not the fastest-paced game I got used to it after a few minutes. Switch games in general have controller latency outside of the 60fps Nintendo titles, so meh 😝
 

GymWolf

Gold Member
A question for the PC crowd: what is gonna be the heaviest thing to render in Starfield? The spaceship battles? I'm not familiar with that genre, so I don't know how heavy space combat is to render.
 
Last edited:

hououinkyouma00

Gold Member
Phenomenal-looking game. Anyone played it? It didn't get the best reviews, but I'd be interested just because the production values are so insane. Seems like it will come to Game Pass or Sony's equivalent sooner rather than later…
I bought it day 1 because I was desperate for a new horror game, and it wasn't good. It looks absolutely amazing, but the gameplay is pretty shit, with a terrible melee system.
 

SlimySnake

Flashless at the Golden Globes
A question for the pc crowd, what is gonna be the heaviest thing to render in starfield? The spaceship battles? I'm not familiar with that genre so i don't know how much heavy space combat is.
There is nothing in space, so that's the easiest thing to render.

Foliage and vast open worlds will always be hardest on the GPU, but in Starfield's case the settlements will be bottlenecked by the CPU, à la Baldur's Gate 3's Act 3, so expect drops there if you skimped on the CPU.

Check out what COD looked like in space: Infinite Warfare, at 60fps, last gen. Absolutely stunning space battles.
 