
Avowed Runs at 30fps on Xbox Series X and S, Obsidian Confirms

StereoVsn

Member
acd01f5f88149726fdd8fafc7c1943d3.gif


Ok, now I am hungry. Time for a fried chicken run! There is a Korean chicken place nearby that’s great!

I am going to eat some wings at cinematic 24 fps for maximum enjoyment.

Korean-Fried-Chicken-Wings-B-2FEAT.jpg
 

StereoVsn

Member
Ass?

N6OrS5z.gif



KUvdHQo.gif



7507cc32f566503f34732614c9c33c93ea4b12a2.gifv


It might not be a stunner like Avatar but it does not look like ass lmao. Outer Worlds looked like ass.
And that’s why they should have offered options to players. Much like Bethesda was able to (mostly) get a 60fps mode working, Obsidian should have done the same, and not six months post-release.
 

Topher

Identifies as young
And that’s why they should have offered options to players. Much like Bethesda was able to (mostly) get a 60fps mode working, Obsidian should have done the same, and not six months post-release.

Exactly. Obviously, with how much this has been debated just here on GAF, it is clear folks have different preferences on this and it is incredibly subjective. What I have "disdain" for in all this is the attitude that 30fps should be the only choice available just because it suits someone's personal preference. On PC, this is simply a matter of adjusting settings. On console, it is doing the same thing and creating a preset mode. I just do not get how having that choice is a bad thing.
 

SlimySnake

Flashless at the Golden Globes
Obviously it looks better than OW, but that game was never a looker. There is nothing mindblowing about Avowed's graphics; we already had super demanding UE5 games that didn't look very good, like Immortals of Aveum. Just using Lumen and Nanite doesn't automatically make a game look good.

I get why the game is targeting 30fps with that tech, but for me it's not worth it. They could have made some sacrifices to get the game running at 60fps and it wouldn't have changed the overall presentation that much; there are plenty of last-gen games that don't look much worse than this, and they obviously aren't using Nanite/Lumen (and would run at 60fps on the Series X).

Just having a working 40fps mode at launch would be something, but based on how long it took Redfall and Skyrim to get patches, I doubt this game will be "playable" (for me at least) for months. MS loves to fuck stuff up.
That's because Immortals targeted 60fps with no 30fps mode; the director/studio head was a 60fps nutjob who mandated 60fps on the team from the start. Had they targeted 30fps, it would've looked like Avowed at worst and Wukong at best.

So basically, if you want 60fps, you should expect Avowed to look like shit, the way Immortals did.

I'm all for 40fps modes, and skipping one is a fuck-up, but 60fps on consoles requires too many sacrifices. You and I both know it's not just the GPU at that point but also a CPU bottleneck, and from the recent DF interview with Ninja Theory we found out that Nanite has a memory bottleneck on consoles too. We have no open-world UE5 game running at 60fps on consoles. Robocop is linear. Black Myth needs frame generation to convert 40 to 60fps. Remnant 2 didn't use Lumen, only Nanite. And Lords of the Fallen is straight up broken in its 60fps mode on consoles, almost never hitting 60fps despite turning down a whole slew of settings and the resolution, which completely robs the game of its look. Avowed is technically the first open-world UE5 game of the bunch, and it's an RPG with systems that hit the CPU and memory differently than action-adventure open-world games like Horizon FW and Spider-Man 2. Both of those might look better in YouTube comparisons, but I promise you Avowed will have its own strengths thanks to its Lumen and Nanite implementation.

Just wait and see it on your screen. You can have GOW running on the other HDMI port and switch back and forth. As good as GOW looks, I can promise you Avowed will be doing several things better. I can already see way better asset quality, light bounce, and sheer density packed into each area.
 
Last edited:

RickMasters

Member
I'm on PC. But 30 is totally fine if they implement smooth frame-pacing and low latency. Both feasible.

The 60-or-bust crowd is a bit cringe if you're on console. Good luck skipping GTA6, which will be locked to 30 for sure.
Good point. While this is disappointing news, I expect a lot more AAA games to run at 30FPS as this gen rolls on and devs push for more complex visuals.


I’m not even sure why people think GTA6 on consoles will run at 60. And I don’t expect a PS5 Pro to change that, if it’s true that it essentially just runs an up-clocked version of the PS5's CPU (which is the same CPU as in the Series X, albeit in a slightly different config, right?).



If you want 60FPS or better in everything, I say just build a PC.
 

Bernardougf

Member
I'm on PC. But 30 is totally fine if they implement smooth frame-pacing and low latency. Both feasible.

The 60-or-bust crowd is a bit cringe if you're on console. Good luck skipping GTA6, which will be locked to 30 for sure.
It's so cute watching people in 2024, after 99% of games have shipped with 60fps modes, cope with Xbox 30fps games because of "reasons". It was the same bullshit with Starfield, right up until the 60fps mode released and the initial "creative" vision suddenly wasn't important anymore.

And GTA6? Well, it's easy to skip any game and play it later when it's available at 60fps... self-control should be basic for normal human beings if they want something specific.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
And that’s why they should have offered options to players. Much like Bethesda was able to (mostly) get a 60fps mode working, Obsidian should have done the same, and not six months post-release.
Put yourself in their shoes. You spend five years creating assets from scratch, you carefully light each area, placing shadows and light sources to make it look as good as it possibly can. And then you have some exec come in and tell you to take out half of the shadows, downgrade the asset fidelity, halve the resolution, remove all volumetric effects, remove all RT effects, all because of backlash from the same audience that was handing GOTY awards left and right to 30fps games just three years ago.

I don't think players should be offered any options. Consoles are not PCs, and I don't think we are entitled to 60fps. The devs are making the game; it's their game, their vision, and they get to decide whether they want to prioritize fidelity or performance. I don't buy games that prioritize performance, because fuck devs who want to keep making last-gen games in 2024. If you guys feel so strongly about it, it's your prerogative not to buy these games. Or to buy them on PC.

I would never go to Rockstar and ask them to downgrade their vision for GTA6 just because I want to play at 60fps on a 10-teraflop console, when I could go out today, buy a 30-teraflop GPU for less than $700, and triple the framerate. Just because Avowed doesn't look as good as Rockstar's or Naughty Dog's games doesn't mean I get to demand 60fps from them. Their artists did the best they could with that engine, and their artwork should be experienced the way they intended.

P.S. I turned off the 30fps mode in Callisto on PS5 and my heart sank. It looked like a completely different game. They removed all three RT effects, turned down volumetric effects, reduced environment quality, and halved the resolution. No wonder nobody thought the game looked great; people played the shit version.
 
Ass?

N6OrS5z.gif



KUvdHQo.gif



7507cc32f566503f34732614c9c33c93ea4b12a2.gifv


It might not be a stunner like Avatar but it does not look like ass lmao. Outer Worlds looked like ass.

Yes, ASS

If you look at actual gameplay

You're getting hoodwinked by panoramic shots just like you did with Starfield, and when you actually got to a planet in Starfield it looked nowhere near the pre-canned shots. It looked terrible: barren and bland.
 

Bernardougf

Member
If GTA VI is 30 FPS then I'll probably wait for PC this time.

I refuse to have double standards, and my OLED hates 30FPS.
Waiting to play something is for sure the best approach these days... GTA6 is a game that will live for a long time... no need to rush and play the pseudo-beta as soon as it launches... in fact, any game benefits from a 6-to-12-month wait. Cheaper, better optimized, and even more complete with DLC.
 

rodrigolfp

Haptic Gamepads 4 Life
take out half of the shadows, downgrade the asset fidelity, halve the resolution, remove all volumetric effects, remove all RT effects, all because of backlash from the same audience that was handing GOTY awards left and right to 30fps games just three years ago.
The PC version already lets you do all that on graphics settings, so...
 

Bernardougf

Member
I have been gaming on an LG CX OLED since day 1 on PS5. Miles, Ratchet, Horizon, FF16 all run fine on the OLED at 30fps. Smooth as butter. Some games are trash at 30, like Demon's Souls and FF7 Rebirth, but that's on the devs. I also went back and played DriveClub on this TV thinking it would be shit, but it's fine and still has the best sense of speed to date despite the 30fps cap.

Right now I'm playing Wukong at 30fps on PC. I had to make sure I locked the framerate through RTSS instead of in-game or via the Nvidia control panel, because both of those give you incorrect frame pacing, but 30fps is very smooth with proper frametimes on my OLED.

P.S. Horizon FW had some brightness flicker issues at launch, but it was down to some motion blur and sharpness settings and I was able to fix it after messing around with them. It was patched later, so clearly a
LoL, calling 30fps smooth as butter... that's some next-level cope for better graphics at all costs... but nice try
 
Last edited:

Bernardougf

Member
I can accept that the quality of the game is sometimes heavily degraded and we lose a lot of fidelity to play at 60fps... I can see all the shortcomings of lower resolutions and assets, but for my taste I prefer the obvious advantages of 60fps for a smoother gameplay experience... why is it so fucking hard for these graphics whores to admit they prefer seeing pretty shit at a lower framerate? No... it's always the same bullshit... 30 is amazing... you can barely see the difference... it's BUTTER SMOOTH... gtfo
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Yes, ASS

If you look at actual gameplay

You're getting hoodwinked by panoramic shots just like you did with Starfield, and when you actually got to a planet in Starfield it looked nowhere near the pre-canned shots. It looked terrible: barren and bland.
Nah, Starfield blew me away. The lighting and asset quality in that game is stunning. Some of the randomly generated open worlds do look like shit, but I can't expect procedurally generated worlds to look good. The actual levels, ships, and interiors designed by the devs looked amazing, because the devs decided to use real-time GI and extremely high-quality assets, all of which carry a high GPU cost.


oTOj0r3.gif



qt0KJyN.gif



MaSL1U8.gif


These are all from my playthroughs. The game looks better than this, because my capture is only 1440p and downscaling the gifs adds all those shimmering artifacts and removes half the detail.

Avowed will have its moments where it looks bland; it's an open-world game after all. But the tech they are using speaks for itself and will stand out once people finally play the game on their TVs.
 
Nah, Starfield blew me away. The lighting and asset quality in that game is stunning. Some of the randomly generated open worlds do look like shit, but I can't expect procedurally generated worlds to look good. The actual levels, ships, and interiors designed by the devs looked amazing, because the devs decided to use real-time GI and extremely high-quality assets, all of which carry a high GPU cost.


oTOj0r3.gif



qt0KJyN.gif



MaSL1U8.gif


These are all from my playthroughs. The game looks better than this, because my capture is only 1440p and downscaling the gifs adds all those shimmering artifacts and removes half the detail.

Avowed will have its moments where it looks bland; it's an open-world game after all. But the tech they are using speaks for itself and will stand out once people finally play the game on their TVs.

The only things that look good are the interiors.

Everything else is mid as fuck at best.
 

SlimySnake

Flashless at the Golden Globes
The PC version already lets you do all that on graphics settings, so...
Yes, and PC gamers spend thousands of dollars on the GPU alone to max out those settings. You were in the Black Myth thread; everyone there was pissed they weren't getting 120fps with path tracing maxed out. No one wants to play at shit settings, especially not PC gamers. Why would console gamers?
LoL, calling 30fps smooth as butter... that's some next-level cope for better graphics at all costs... but nice try
I am the one playing it right now. You want me to capture my playthrough? I was shocked myself at how smooth the camera pans were, even with motion blur turned off. Like I said before, most people on PC have no idea how to fix frametime issues, so they conclude that PC games just run like shit at 30fps.

Most gameplay videos you guys see on YouTube are captured at 30fps. TLOU2's E3 2018 demo was revered by everyone as the best-animated, smoothest combat experience we had seen at the time. It honestly still is. 30fps can definitely feel smooth if your frametimes are rock solid; it's how we all played video games up until three years ago. Our eyes have not magically evolved in three years to reject 30fps.
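To show what I mean by "rock-solid frametimes", here is a rough sketch of what an external limiter like RTSS is effectively doing. This is purely illustrative C++, not RTSS's or any engine's actual code: the point is that each frame is released on a fixed ~33.3 ms cadence instead of whenever it happens to finish.

[CODE]
// Illustrative only, not actual RTSS/engine code: a fixed-cadence 30fps limiter.
// "Smooth" 30fps means every frame lands on the same ~33.3 ms deadline,
// not just averaging 30 frames over a second.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr auto kFrameBudget = std::chrono::microseconds(33333); // ~1/30 s

    auto deadline = clock::now() + kFrameBudget;
    for (int frame = 0; frame < 300; ++frame) {
        // simulate_and_render() would go here and must finish inside the budget.

        std::this_thread::sleep_until(deadline); // hold the frame to the cadence
        deadline += kFrameBudget;                // advance by exactly one budget,
                                                 // so a late frame doesn't shift the grid
    }
}
[/CODE]

If the engine misses a deadline you feel it as a hitch; if every deadline is hit, camera pans look even despite being 30fps.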
 

rodrigolfp

Haptic Gamepads 4 Life
No one wants to play at shit settings, especially not PC gamers. Why would console gamers?
Some do, like some Steam Deck players and GTX 1660 owners. Few console gamers care that much about graphics; they will accept whatever the devs deliver, like Witcher 3 on Switch. Even more so as an option.
 
Last edited:

Killer8

Gold Member
LoL, calling 30fps smooth as butter... that's some next-level cope for better graphics at all costs... but nice try

It is smooth as butter if the alternative experience on PC is excessive shader-compilation and traversal stutter. Recently the Final Fantasy 16 demo was released on Steam, and despite being able to play at 100fps on my 4070 Super using DLAA and frame gen, the game stutters every few steps when moving through the game world. That simply wasn't present in the PS5 version. I switched back and forth between them and found the locked 30fps on PS5 to be the least bad option. Some people value a consistency of experience which PC has failed time and time again to deliver this generation.
 

rodrigolfp

Haptic Gamepads 4 Life
It is smooth as butter if the alternative experience on PC is excessive shader-compilation and traversal stutter. Recently the Final Fantasy 16 demo was released on Steam, and despite being able to play at 100fps on my 4070 Super using DLAA and frame gen, the game stutters every few steps when moving through the game world. That simply wasn't present in the PS5 version. I switched back and forth between them and found the locked 30fps on PS5 to be the least bad option. Some people value a consistency of experience which PC has failed time and time again to deliver this generation.
Not here. On both desktop and Steam Deck. 🤷‍♂️
 

SlimySnake

Flashless at the Golden Globes
30 is amazing... you can barely see the difference... it's BUTTER SMOOTH... gtfo
I never said 30 is amazing or that you can barely see the difference. Of course 60fps is always better; if I didn't believe that, I wouldn't have spent six months of paychecks (making $5.85 an hour 20 years ago) on a PC, or thousands of dollars every gen since.

But Black Myth IS butter smooth at 30fps, just like most Sony games last gen. Not on consoles, where it is uncapped, but on my PC using the proper tools. Was Bloodborne butter smooth despite being locked to 30fps? No. Was Uncharted 4? Yes. Does Uncharted 4 on PS5 at 60fps play better than Uncharted 4 at 30fps on PS4? Yes.

If I can admit that, why can't you admit that 30fps on an OLED with a flat frametime graph would appear smooth? All I have gotten so far are gif replies, people saying I'm coping, or people saying the game looks awful, trash, shit, etc., despite being shown gifs showing otherwise. I don't think I'm the one coping. I am not the one with the agenda for 30fps; it's the people who know why it's 30fps and choose to ignore the reasoning behind it, because they have a 60fps agenda they want to shove down everyone's throats instead of ponying up for a PC like the rest of us. You can build a $500 potato PC and run games at Series S quality settings at 60fps if you are that price-conscious.
 
Last edited:

Topher

Identifies as young
Yes, and PC gamers spend thousands of dollars on the GPU alone to max out those settings. You were in the Black Myth thread; everyone there was pissed they weren't getting 120fps with path tracing maxed out. No one wants to play at shit settings, especially not PC gamers. Why would console gamers?

You are missing the point. Developers already allow their games to be played at lower settings for higher frame rates. A mode is just a preset collection of settings that are already in the engine. The only difference on console is that you get a couple of choices instead of many. So this....

"And then you have some exec come in, tell you to take out half of the shadows, downgrade the asset fidelity, halve the resolution, remove all volumetric effects, remove all rt effects"

...are already options in the game whether you like it or not. So ultimately what you are arguing for is simply the removal of a toggle button. That's really it, and there is no good reason for it.
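To put it concretely, this is roughly all a console "mode" amounts to. A made-up sketch, not Obsidian's actual settings, names, or values:

[CODE]
// Hypothetical sketch, not Avowed's real configuration: a console "mode"
// is just a named bundle of the same scalability settings the PC build
// already exposes individually.
#include <cstdint>

struct GraphicsSettings {
    std::uint32_t render_height;   // internal resolution before upscaling
    bool          lumen_gi;        // global illumination on or off
    bool          volumetric_fog;
    float         shadow_quality;  // 0.0-1.0 scalability bucket
    std::uint32_t target_fps;
};

enum class ConsoleMode { Quality, Performance };

GraphicsSettings PresetFor(ConsoleMode mode) {
    switch (mode) {
        case ConsoleMode::Quality:     return {2160, true,  true, 1.0f, 30};
        case ConsoleMode::Performance: return {1440, false, true, 0.5f, 60};
    }
    return {};  // unreachable; keeps compilers happy
}
[/CODE]

A second preset obviously needs its own QA pass, but it is the same settings machinery either way.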
 

FireFly

Member
I can accept that the quality of the game is sometimes heavily degraded and we lose a lot of fidelity to play at 60fps... I can see all the shortcomings of lower resolutions and assets, but for my taste I prefer the obvious advantages of 60fps for a smoother gameplay experience... why is it so fucking hard for these graphics whores to admit they prefer seeing pretty shit at a lower framerate? No... it's always the same bullshit... 30 is amazing... you can barely see the difference... it's BUTTER SMOOTH... gtfo
A recent study found substantial differences between individuals in the temporal resolution at which they were able to perceive the world. I expect that the more responsive your visual system is, the worse 30 FPS will feel.
 

Topher

Identifies as young
I am not the one with the agenda for 30fps; it's the people who know why it's 30fps and choose to ignore the reasoning behind it, because they have a 60fps agenda they want to shove down everyone's throats instead of ponying up for a PC like the rest of us. You can build a $500 potato PC and run games at Series S quality settings at 60fps if you are that price-conscious.

Can't speak for others, but I'm all in favor of you having a 30fps option. But honestly, as far as I can tell, the only person here who wants to take away options from others is you.

Awkward John Krasinski GIF by Saturday Night Live
 
Last edited:

Bernardougf

Member
A recent study found substantial differences between individuals in the temporal resolution at which they were able to perceive the world. I expect that the more responsive your visual system is, the worse 30 FPS will feel.
Yeah, it may well be... but I always roll my eyes at individuals who can see minimal pixel differences, or fine details in lighting or grass, but at the same time say they cannot see the difference between 30 and 60fps... it's just ridiculous bullshit, since all of this is visual detail perception... can we at least agree on that?
 
Last edited:

Topher

Identifies as young
Yeah, it may well be... but I always roll my eyes at individuals who can see minimal pixel differences, or fine details in lighting or grass, but at the same time say they cannot see the difference between 30 and 60fps... it's just ridiculous bullshit, since all of this is visual detail perception... can we at least agree on that?

Hard to say. I can't see what others see. I mean.....I agree with you. I hate 30fps, but some folks do fine with it. Doesn't make sense to me, but that's why I think we need options.
 

Bernardougf

Member
You are missing the point. Developers already allow their games to be played at lower settings for higher frame rates. A mode is just a preset collection of settings that are already in the engine. The only difference on console is that you get a couple of choices instead of many. So this....

"And then you have some exec come in, tell you to take out half of the shadows, downgrade the asset fidelity, halve the resolution, remove all volumetric effects, remove all rt effects"

...are already options in the game whether you like it or not. So ultimately what you are arguing for is simply the removal of a toggle button. That's really it, and there is no good reason for it.
Mate, it's the same ridiculous argument as in the Starfield topic... creative vision... devs' choices... and then the same fucking game comes out on PC, with the usual multitude of settings that allow their vision to be degraded into oblivion... these apologists are helpless without new arguments.
 

Bernardougf

Member
Hard to say. I can't see what others see. I mean.....I agree with you. I hate 30fps, but some folks do fine with it. Doesn't make sense to me, but that's why I think we need options.
I just think the same eyes that catch minimal graphical details should easily see the difference between 30 and 60fps, since it's a visual thing too... I can understand someone not seeing either, but seeing one and not the other? Nah... they go together... and yes, options are great. Always.
 
Last edited:

FireFly

Member
Yeah, it may well be... but I always roll my eyes at individuals who can see minimal pixel differences, or fine details in lighting or grass, but at the same time say they cannot see the difference between 30 and 60fps... it's just ridiculous bullshit, since all of this is visual detail perception... can we at least agree on that?
Well, as I see it, spatial resolution is the amount of detail your eyes and visual system can resolve at a given moment, while temporal resolution is the frequency at which your brain can update that information. So you might have bad eyesight but your brain can "update" 120 times a second. Or you might have great eyesight but your brain updates at half the speed.

Personally I can "feel" the responsiveness difference between 30 and 60 FPS (e.g. playing a multiplayer game), but my brain struggles to perceive the extra visual smoothness.
 
Last edited:
Nah, Starfield blew me away. The lighting and asset quality in that game is stunning. Some of the randomly generated open worlds do look like shit, but I can't expect procedurally generated worlds to look good. The actual levels, ships, and interiors designed by the devs looked amazing, because the devs decided to use real-time GI and extremely high-quality assets, all of which carry a high GPU cost.


oTOj0r3.gif



qt0KJyN.gif



MaSL1U8.gif


These are all from my playthroughs. The game looks better than this, because my capture is only 1440p and downscaling the gifs adds all those shimmering artifacts and removes half the detail.

Avowed will have its moments where it looks bland; it's an open-world game after all. But the tech they are using speaks for itself and will stand out once people finally play the game on their TVs.
Yeah, Starfield actually had pretty great graphics: uneven, but far better than most. Avowed, however... I'm not seeing it, tbh.


I do agree on the 30fps front though. The Last of Us 2 was 30fps on PS4. Imagine if they had tried to get all those games to 60 to release on that console; it would have ruined the entire look. No one complained. It takes 15 minutes tops to become reacquainted with and used to 30fps (yes, on an OLED; I have a C1). People saying it's "unplayable" must've skipped three generations of consoles.
 
Last edited:

Topher

Identifies as young
Well, as I see it, spatial resolution is the amount of detail your eyes and visual system can resolve at a given moment, while temporal resolution is the frequency at which your brain can update that information. So you might have bad eyesight but your brain can "update" 120 times a second. Or you might have great eyesight but your brain updates at half the speed.

Personally I can "feel" the responsiveness difference between 30 and 60 FPS (e.g. playing a multiplayer game), but my brain struggles to perceive the extra visual smoothness.

This makes sense to me. I make all kinds of adjustments on PC to get the frame rate up, and I always have a hard time seeing the difference in visual quality. But I notice a low frame rate immediately. Actually, for me, a higher frame rate always looks better than a higher resolution. I'm guessing I'm better at processing more frames than I am at seeing a single frame's visual quality. If that makes sense.
 

SlimySnake

Flashless at the Golden Globes
You are missing the point. Developers already allow their games to be played at lower settings for higher frame rates. A mode is just a preset collection of settings that are already in the engine. The only difference on console is that you get a couple of choices instead of many. So this....

"And then you have some exec come in, tell you to take out half of the shadows, downgrade the asset fidelity, halve the resolution, remove all volumetric effects, remove all rt effects"

...are already options in the game whether you like it or not. So ultimately what you are arguing for is simply the removal of a toggle button. That's really it, and there is no good reason for it.
I suppose if it's allowed on PC, it should be allowed on consoles. But like with Starfield, we don't know what the bottlenecks are. We do know that with Starfield it was the CPU that was the bottleneck, and simply reducing the graphics settings wouldn't cut it; they spent months optimizing CPU performance and even then weren't able to get the cities to run at 60fps. Even on PC, the game ran like shit in the cities because of the CPU bottleneck, and PC gamers rioted over being told to upgrade their systems.

So maybe it's just like the Starfield situation and it's not just a GPU bottleneck but a CPU one. We need to stop judging these RPGs by the same standard as action-adventure games, which are very light on the CPU. I was telling people that my CPU was being hammered at 70-80% when no other game ever comes close to even 40%, but they kept saying it was unoptimized. Nah, it was just heavy. Even after dozens of patches, CPU performance only improved by 15-20% at best.

When Kingdom Come 2 comes out, we will do this whole song and dance again, instead of just celebrating what the game is doing visually and giving the devs credit for actually putting in effort and incorporating next-gen tech. We deserve cross-gen.
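Rough numbers to show why a CPU bottleneck is different from a GPU one. These are made-up figures, not measurements from Starfield or Avowed; the point is that the frame can only go out as fast as the slower of the two sides.

[CODE]
// Made-up illustrative numbers, not measurements: when the CPU side of a
// frame costs more than 16.6 ms, cutting GPU settings alone can't reach 60fps.
#include <algorithm>
#include <cstdio>

int main() {
    const double cpu_ms             = 25.0; // simulation, AI, streaming (hypothetical)
    const double gpu_quality_ms     = 30.0; // quality-mode GPU cost (hypothetical)
    const double gpu_performance_ms = 14.0; // after halving resolution/effects (hypothetical)

    auto fps = [](double cpu, double gpu) {
        return 1000.0 / std::max(cpu, gpu); // the slower of the two paces the frame
    };

    std::printf("quality mode:     %.1f fps\n", fps(cpu_ms, gpu_quality_ms));     // ~33.3 fps
    std::printf("performance mode: %.1f fps\n", fps(cpu_ms, gpu_performance_ms)); // ~40 fps, still not 60
}
[/CODE]

Which, under those assumed numbers, is exactly why a 40fps mode can be feasible while 60 is not.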
 

Ashamam

Member
Smooth as butter.
This is an interesting comment, and it goes to perception and how people's brains process visual data.

You are not wrong within your own frame of reference, but you are completely wrong in many other people's.

It is no exaggeration for me to say that NO 30fps image on an OLED is smooth as butter in motion, especially panning across the wider horizontal axis. It is ALWAYS jarring as f*ck and takes a few minutes for the experience to smooth out in my head a bit. But even then it's never smooth, just familiar.

It's just as noticeable if for some reason I accidentally watch a TV show in Game mode. The instant the shot pans, I'm reaching for the remote with the rest of the family laughing at me. My wife just shakes her head, as she doesn't see it, but the kids do (though it doesn't bother them as much). So it's all about the individual.
 

MikeM

Gold Member
This kind of shit is why I'm glad I dumped console as my main for PC. Some of these devs/execs really prioritize the wrong things.
I hope day-one sales suck on console, to send a message.
 
Last edited:
This is an interesting comment, and it goes to perception and how people's brains process visual data.

You are not wrong within your own frame of reference, but you are completely wrong in many other people's.

It is no exaggeration for me to say that NO 30fps image on an OLED is smooth as butter in motion, especially panning across the wider horizontal axis. It is ALWAYS jarring as f*ck and takes a few minutes for the experience to smooth out in my head a bit. But even then it's never smooth, just familiar.

It's just as noticeable if for some reason I accidentally watch a TV show in Game mode. The instant the shot pans, I'm reaching for the remote with the rest of the family laughing at me. My wife just shakes her head, as she doesn't see it, but the kids do (though it doesn't bother them as much). So it's all about the individual.
It’s what you’ve trained yourself to see after playing 60fps games. You could see 30fps as smooth if you had never played at 60fps, just like I’m sure people used to 120fps can see the difference from 60. We spent decades playing games at 30 and very few people actually gave a fuck, though.
 
Last edited:

Topher

Identifies as young
I suppose if it's allowed on PC, it should be allowed on consoles. But like with Starfield, we don't know what the bottlenecks are. We do know that with Starfield it was the CPU that was the bottleneck, and simply reducing the graphics settings wouldn't cut it; they spent months optimizing CPU performance and even then weren't able to get the cities to run at 60fps. Even on PC, the game ran like shit in the cities because of the CPU bottleneck, and PC gamers rioted over being told to upgrade their systems.

So maybe it's just like the Starfield situation and it's not just a GPU bottleneck but a CPU one. We need to stop judging these RPGs by the same standard as action-adventure games, which are very light on the CPU. I was telling people that my CPU was being hammered at 70-80% when no other game ever comes close to even 40%, but they kept saying it was unoptimized. Nah, it was just heavy. Even after dozens of patches, CPU performance only improved by 15-20% at best.

When Kingdom Come 2 comes out, we will do this whole song and dance again, instead of just celebrating what the game is doing visually and giving the devs credit for actually putting in effort and incorporating next-gen tech. We deserve cross-gen.

How was it not optimization when optimization is what allowed them to implement higher frame rates? And a performance mode doesn't have to be 60fps; even 40fps gives noticeably smoother gameplay. But honestly, I still don't know what was so CPU-intensive about Starfield. Perhaps all those generated NPCs wandering around the cities like zombies. But once again... that's a setting. Starfield on PC has the ability to reduce crowd size, and it helped in my gameplay. Considering how horrible they all looked... that was a blessing. The fact of the matter is that Starfield should have been delayed. It just wasn't ready at launch.
 