It made sense to me. The 60fps mode itself would have to look much worse, but it wouldn't affect their original "creatively 30fps game."
> It made sense to me. The 60fps mode itself would have to look much worse, but it wouldn't affect their original "creatively 30fps game."

If it looks worse, it means you can't experience the game very well.
> If it looks worse, it means you can't experience the game very well.

That's a personal opinion. Frame rate and temporal information are more important to some than resolution or texture work.
> That's a personal opinion. Frame rate and temporal information are more important to some than resolution or texture work.

Good for these people.
> It made sense to me. The 60fps mode itself would have to look much worse, but it wouldn't affect their original "creatively 30fps game."

Exactly. That's how it happens in every game with a performance mode.
> If it looks worse, it means you can't experience the game very well.

Every game looks worse in the performance mode. Is there literally any game that looks better in the performance mode than in quality mode? That's what a performance mode means -- it deprioritizes visual fidelity in favor of performance and more frames.
> When you expect less, you accept less, and then you get less.

Bullshit. What I'm not willing to accept is a lack of ambition. It's sad, although perhaps somewhat understandable, to watch hugely talented people making the same damn game over and over, taking minimal risks, doing what they already know how to do.
> Bullshit. What I'm not willing to accept is a lack of ambition, talented people making the same damn game over and over, taking minimal risks, doing what they already know how to do.

It's a Reddit viewpoint. They think they know more than the devs who make the game.
This is the most ambitious game I've yet seen this gen. I, and I would think every minimally sane person, am willing to cut these guys plenty of slack.
> Every game looks worse in the performance mode. Is there literally any game that looks better in the performance mode than in quality mode? That's what a performance mode means -- it deprioritizes visual fidelity in favor of performance and more frames.

All the more reason for Bethesda to lock it down at 30 fps.
Everybody knows the game will look worse in PF mode. It still doesn't stop the majority of people from playing in PF mode because 2x frames provide a much better experience to most gamers.
And people can always shift back to 30 FPS if they don't like how the game looks. That's what people did in Horizon Forbidden West when the 60 FPS mode had shimmering.
> This isn't a graphically driven arcadey game though. Todd gave us a clear hint it's not just about graphics, but about game fidelity as well. A Plague Tale: Requiem is just a small hint of the sacrifices that BGS would have to expand on in a CPU heavy game; this could include core mechanics in the backend, IDK. But I do know there's no reason to think this is a big conspiracy in any way.

There is a reason why they are called Bugthesda.
> There is a reason why they are called Bugthesda.

We have to hope they ironed out all those bugs first.
> I agree. No arguments there. But that's how every developer does it. And to account for that, they create 2 modes:
>
> - A 30 FPS mode, where they make the game look like they wanted to, at increased visual fidelity.
> - A 60 FPS mode, where they compromise the graphics in order to give players the option to play at higher frames if they choose to.
>
> What's misleading about this statement by Xbox is the fact that the existence of a 60 FPS mode does not compromise the look and fidelity of the game in 30 FPS mode.

Well, that's because those games are more GPU bound; lowering graphical complexity scales easily. But if the game is CPU bound, there's no way to do that without compromising actual game design.

If anything, what current gen games should strive for is pushing the CPU with actual gameplay complexity instead of pushing for "moar graphx!!!1!!1".
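One way to picture the GPU-bound vs CPU-bound point above: per frame, CPU work (simulation) and GPU work (rendering) run largely in parallel, so the frame time is roughly whichever of the two is slower. A minimal sketch, with invented numbers and an assumed linear pixel-count scaling, not measurements from any real game:

```python
# Illustrative sketch: frame time is roughly bounded by the slower of
# the CPU (simulation) and GPU (rendering) work for that frame.
def frame_time_ms(cpu_ms: float, gpu_ms_at_native: float, resolution_scale: float) -> float:
    # Assume GPU cost scales roughly with the number of pixels shaded;
    # CPU cost (AI, physics, scripting) is unaffected by resolution.
    gpu_ms = gpu_ms_at_native * resolution_scale
    return max(cpu_ms, gpu_ms)

# GPU-bound game: dropping to half the pixels unlocks 60 fps (<16.7 ms).
print(frame_time_ms(cpu_ms=10.0, gpu_ms_at_native=30.0, resolution_scale=0.5))  # 15.0

# CPU-bound game: the same resolution drop changes nothing that matters.
print(frame_time_ms(cpu_ms=25.0, gpu_ms_at_native=30.0, resolution_scale=0.5))  # 25.0
```

In the second case the frame is still 25 ms no matter how far the resolution slider goes, which is exactly why a performance mode built on graphics cuts alone can't rescue a CPU-bound game.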
> Seriously, people who don't understand that are so, so, so stupid. Reading the recent Starfield threads gave me a chronic headache.

It's the equivalent of this.
> If it looks worse, it means you can't experience the game very well.

It's a Bethesda game. Add that with their history.
Why is that all of a sudden an issue now? It's always been this way.
> It's a Bethesda game. Add that with their history.
> You will know why it's bad.

Can't look any worse than FO4. In fact, I would say it would still look far better. Lighting does not take that much of a hit compared to other things.
> Can't look any worse than FO4. In fact, I would say it would still look far better. Lighting does not take that much of a hit compared to other things.

Hope it drops on GeForce Now.

Anyways, the PC tests will tell us all we need to know.
> Who's blaming the console here? Of course the console can output the 60 FPS. It can even do 120 FPS.
> Not working to create a 60 FPS performance mode has nothing to do with the console. It has all to do with the developers cutting corners.

These consoles are duds. There is no corner cutting here. Starfield is simply too much game.
Or options, like the Performance & Fidelity modes in most games right now?
> Seriously, people who don't understand that are so, so, so stupid. Reading the recent Starfield threads gave me a chronic headache.

I have the feeling people think games are created first as 3D models with "effects", and then behavior is magically added on top, so graphics are the stepping stone... That's my understanding of why people call the XSS a "4TF console". Hell no, it's a console with a 4TF GPU (which can mean whatever without context on its own).
> I have the feeling people think games are created first as 3D models with "effects", and then behavior is magically added on top, so graphics are the stepping stone... That's my understanding of why people call the XSS a "4TF console". Hell no, it's a console with a 4TF GPU (which can mean whatever without context on its own).
>
> For people to know: internally, a game is just a skeleton, an infinite blank space with invisible boxes colliding and lots of calculations that consume A LOT of CPU/memory resources, the more gameplay, simulations (including AI), etc. there are.
>
> The graphics are just a facade, a "costume" devs put on top. They make the "skeleton" game with whatever graphics, or even without them, and put that facade on top to make you all think that the characters are the ones moving, that the environment is actually a jungle, a desert, a building. But those are just a facade that consumes GPU resources, so they can be scaled without affecting the main game tasks at all... They also consume CPU resources due to some CPU tasks that have to be passed on to the GPU (like the amount and positions of the vertices on screen), but with the new technologies those CPU-only tasks should be mitigated or relegated to the GPU.
>
> Most 8th generation games are GPU bound, so gameplay stagnated but graphics improved. When the Pro/X consoles came out, they could scale game performance due to just having more powerful GPUs, but they kept the same CPU. That's also why you see many games on those consoles with only 30 fps modes, especially these days; they're probably CPU bottlenecked.
>
> So if the game is CPU bound on current gen machines, there are two things that will happen:
>
> 1. Games cannot run on 8th gen machines no matter how "cross gen" or "dated" they look; they simply can't, and there's nothing to do but completely redesign those games if devs are asked to port them down. It's not about how they look, it's about what's happening inside, the invisible stuff.
> 2. Performance cannot improve by reducing graphical features, since that reduces GPU usage but keeps CPU usage, so basically no difference at all. Maybe they can gain back some... 5% performance? Maybe.
>
> It's like having a machine that works with water and oil, one to make fire and the other for electricity. You use more oil than water, so when there's no more oil, you can't think of using water to produce more fire; it simply doesn't work like that, because the system asks you for oil, not water. You can try decomposing water molecules and getting oxygen to make more fire, but that's more complex stuff to do, and your system is not prepared for that task, nor is it the most efficient or economically viable way.
>
> That's basically what being CPU bound or GPU bound in a game means: you're bound to water or to oil. You can be bound to both, but each on their own tasks.

So was Todd Howard lying when he said Starfield could hit 60 fps on the Series X?
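The skeleton-and-facade explanation above can be sketched as a toy game loop. Everything here is invented for illustration, not anyone's real engine: `simulate` is the CPU-bound "skeleton" that must run every tick, and `render` is the "facade" that a quality setting can scale without touching the simulation.

```python
def simulate(world: dict) -> None:
    # The "skeleton": collision, AI, scripting -- CPU work that must run
    # every tick no matter what the player sees on screen.
    world["tick"] += 1
    world["npc_positions"] = [(x + 1, y) for (x, y) in world["npc_positions"]]

def render(world: dict, quality: str) -> str:
    # The "facade": purely cosmetic, so it can be scaled down freely
    # without changing what the simulation computes.
    detail = {"low": 1, "high": 3}[quality]
    return f"frame {world['tick']} at detail level {detail}"

world = {"tick": 0, "npc_positions": [(0, 0), (5, 5)]}
for _ in range(3):
    simulate(world)               # fixed CPU cost per tick
    frame = render(world, "low")  # "low" only shrinks the GPU-side work
print(frame)                      # frame 3 at detail level 1
print(world["npc_positions"])     # [(3, 0), (8, 5)]
```

Switching `"low"` to `"high"` changes only the render output; the NPC positions and tick count come out identical either way, which is the whole point: graphics options scale the facade, not the skeleton.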
> So was Todd Howard lying when he said Starfield could hit 60 fps on the Series X?

On a barren planet? Yes. But in a dense area, nope.
> So was Todd Howard lying when he said Starfield could hit 60 fps on the Series X?

If the game is CPU bound at 30 fps, yes, he was... Where's the surprise? The guy literally said "Fallout 76 won't have bugs on release" on stage, and it was literally the buggiest game in Bethesda history lol
> On a barren planet? Yes. But in a dense area, nope.

I think the whole contention is that people want to be able to make the choice in prioritizing what's important to them, whether that be a locked 30 with the bells and whistles, or an unlocked, sometimes 60, with whatever that may entail.
> If the game is CPU bound at 30 fps, yes, he was... Where's the surprise? The guy literally said "Fallout 76 won't have bugs on release" on stage, and it was literally the buggiest game in Bethesda history lol

I think I recall seeing that the CPU in the Series X is a tier higher than the recommended CPU for PC. Would that mean Starfield would be locked to 30 for PC players too?
> I think the whole contention is that people want to be able to make the choice in prioritizing what's important to them, whether that be a locked 30 with the bells and whistles, or an unlocked, sometimes 60, with whatever that may entail.

There is what people want. Then there is what the devs want in order for their games to work.
As Todd stated in the show, due to how dynamic the game is, they need overhead. So they made the choice to lock it to 30FPS for consistency.
> I think I recall seeing that the CPU in the Series X is a tier higher than the recommended CPU for PC. Would that mean Starfield would be locked to 30 for PC players too?

LOL no, PC has way more powerful CPUs even in the lower mid range, like the 5600X, which is in another league despite being just a generation ahead, and we have way better RAM too. But on a PC that only meets the minimum requirements, I honestly don't know. PC requirements are based on 60 fps as the baseline, but we've seen some devs list requirements for "30 fps" on current gen games, so I'm curious to see how it will end up.
> There is no point in arguing anymore... once the game comes out and is heavily downgraded on the Series S, and/or low/mid PC settings bring fine 60 fps gameplay, we will have our answers.

I guess the "creative choice/vision graphics" wouldn't be meant for the S.
> It made sense to me. The 60fps mode itself would have to look much worse, but it wouldn't affect their original "creatively 30fps game."

It's already been talked about that the game is more CPU bound, so simply lowering the resolution would not = 60fps.
> I guess the "creative choice/vision graphics" wouldn't be meant for the S.

It doesn't have 60 fps.
> It doesn't have 60 fps.

Where did I say 60fps?

You guys are reaching too much.
> Where did I say 60fps?

Because the game is CPU demanding, not graphics. 2 different things.
The arguments being made against 60 are that the graphics would have to be lowered to achieve that optional performance mode, sacrificing their "creative choice." Yet if the graphics have to take a sacrifice -- and lowered resolution is already confirmed for the S -- then that same creative choice is already being sacrificed on the lesser box.
> Because the game is CPU demanding, not graphics. 2 different things.

But... the graphics still need to be lowered on the S.
> It's already been talked about that the game is more CPU bound, so simply lowering the resolution would not = 60fps.

CPU bottlenecked at 4K is certainly something. PC benchmarks will show what's really happening.
> But... the graphics still need to be lowered on the S.

Yeah, but that is not the issue with this game.
> Creative choice.

Love when people talk like they know more than Todd.
> Love when people talk like they know more than Todd.

And here comes the reductive appeal to authority posts. Disingenuous.
> And here comes the reductive appeal to authority posts. Disingenuous.

I mean, I am not the one who is looking for a gotcha moment here.
For 3 years people argued for options and praised them; now, just because "Todd" and the Brand™ say different, they throw all that out the window at the flip of a switch. Good little lemmings.
The argument isn't about "knowing more than such and such."
> I mean, I am not the one who is looking for a gotcha moment here.

Last gotcha:
Like I said, Todd knows how his game works, not us forum people.
> Last gotcha: Like I said, Todd knows how his game works, not us forum people.

Good for these modders.
The modding community knows more about how their games work than we do. It had to.
"Good thing for these modders." - Todd Howard