Can we please stop with the whole "60 fps is not cinematic" argument.

The whole chain of thought came from apologists and game websites with a bias for Microsoft, defending the Xbone version of Tomb Raider.
 
Great post.

Plus, there are actually 'many' games that are NOT SHOOTERS and don't need that extra responsiveness. For the record.
What's "needed" is arguable and it's subject to individual standards.
And yet there isn't a single genre, not even turn-based games, where having 60 fps is detrimental to the experience or less visually appealing, so I don't see the value of your argument.
 
Yes they were. They were designed with 30fps and console limitations in mind. I dunno how anyone can actually refute this. They weren't designed with high-spec systems in mind and then downgraded to consoles. If these games were 60fps instead of 30, they would look worse than they did at 30fps, on console or PC, period. They aren't gonna add extra textures, extra polygons, better character models, bigger geometry, smarter AI, or whatever other improvements you get on the PC port. All of the PC improvements of these games come down to: higher resolution, higher fps, and sharper versions of the details that already exist in some form on the low end (consoles).

I played Spec Ops: The Line on the highest settings on PC; it doesn't look next gen, it looks like a PS360 game, same engine, except that it ran at 1440p on my monitor.

I'm not refuting that at all. You quoted me saying these games *weren't* designed for 60 fps.

They still all play better that way. The trade-offs you make for 60 fps aren't in playability; I'm really not sure where such a suggestion comes from. This generation we're going to start seeing third-person games running at 60 fps on consoles more often, I'd guess, because the visual concessions to hit 60 are no longer what they were, and more developers are going to think it's worth making those concessions for the sake of playability.

I haven't heard The Evil Within's framerate confirmed, but it's more than likely 60 given its engine. Plants vs. Zombies: Garden Warfare was 60. The Last of Us HD will be 60. Tomb Raider wasn't there, but it was close.

You're likely talking about developers of two of the most cinematic third person games of the last two generations both choosing 60 fps for a cinematic project in the here and now.

Presuming that it's 60 fps (which obviously it might not be), has The Evil Within done away with RE4's style of enemies and combat? Is The Last of Us going to turn into an inferior product for running at 60 fps?

No and no. Again, presuming Evil Within is 60.

30 fps is a concession made to achieve other goals. I'm not sure that there is really a game developer out there that chooses 30 for the sake of 30.
 
What's "needed" is arguable and it's subject to individual standards.
And yet there isn't a single genre, not even turn-based games, where having 60 fps is detrimental to the experience or less visually appealing, so I don't see the value of your argument.

Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.
 
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.

That is not how this works at all. "Visual appeal" does not scale linearly with rendering time.
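
To put rough numbers on the budget side (my own back-of-the-envelope math, nothing official): doubling the framerate halves the per-frame rendering time, but a halved budget is not the same thing as halved visual appeal.

```python
# Back-of-the-envelope frame budget arithmetic (illustrative only).
def frame_budget_ms(fps: float) -> float:
    """Milliseconds of rendering time available per frame."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps:>2} fps -> {frame_budget_ms(fps):.1f} ms per frame")

# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms. The budget halves, but perceived
# quality doesn't scale with it: fixed per-frame costs don't shrink, and
# the smoother motion is itself part of what you're looking at.
```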
 
It's like old b&w blurry pictures vs 40MP digital shots.

Sometimes they have a charm of their own and you don't need perfect fidelity.

I'm glad devs have an option. While 60fps always plays better, how much better depends on the way the game was designed, how fast it moves, how much precision is required.

@plagiarize: Nope, those games run at 60 fps because they are either last-gen ports or cross-gen games for which devs used the spare power to double the frame rate, since it's an improvement that didn't require building new assets or any major rework of the graphics. Next-gen Uncharted and TLOU will likely go back to 30.
 
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.
Yeah, I definitely call bullshit on this.
For a start, because you are arbitrarily deciding that increased detail, more complex geometry, etc. are implicitly better than smoothness for making something visually appealing, which isn't a fact of life but a subjective opinion.
On top of that, for many games you actually have an option that allows both (using more powerful hardware).
 
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.

That's ignoring the natural visual appeal of smooth gameplay, which looks better than any resolution bump or particle effect.
 
Were those games 60fps when they first came out? I enjoyed them on consoles and never missed those extra 30 frames. I've beaten 30fps games multiple times and they offered me a lot of playability; overall in my gaming life I've probably completed 10x as many 30fps games as 60fps ones.

Is this some stealth pc masta raze!!! comment? How can you take the best of everything when sacrifices have to be made between IQ and fps?


"Hey guyz, let's just never change anything"-the-post.
 
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.

Not necessarily. It could end up looking even better - if you are planning for 60FPS from the very beginning. You will use a different art style etc. to make up for the lack of graphical fidelity. Sure, in most cases it will look less spectacular, but certainly not "50%" worse. Also, since it's videoGAMES, we wanna play it, don't we? So why not prioritize the game aspect first?
 
I'm not refuting that at all. You quoted me saying these games *weren't* designed for 60 fps.

They still all play better that way. The trade-offs you make for 60 fps aren't in playability; I'm really not sure where such a suggestion comes from. This generation we're going to start seeing third-person games running at 60 fps on consoles more often, I'd guess, because the visual concessions to hit 60 are no longer what they were, and more developers are going to think it's worth making those concessions for the sake of playability.

I haven't heard The Evil Within's framerate confirmed, but it's more than likely 60 given its engine. Plants vs. Zombies: Garden Warfare was 60. The Last of Us HD will be 60. Tomb Raider wasn't there, but it was close.

You're likely talking about developers of two of the most cinematic third person games of the last two generations both choosing 60 fps for a cinematic project in the here and now.

Presuming that it's 60 fps (which obviously it might not be), has The Evil Within done away with RE4's style of enemies and combat? Is The Last of Us going to turn into an inferior product for running at 60 fps?

No and no. Again, presuming Evil Within is 60.

30 fps is a concession made to achieve other goals. I'm not sure that there is really a game developer out there that chooses 30 for the sake of 30.

On limited hardware, yes, yes there are plenty who deliberately shoot for 30 frames. The Evil Within is 60 frames because the developers chose that 60 frames best met their goals for what they wanted the game to be. Consoles are fixed hardware, which means you will always have to make compromises and decisions that you feel best suit your game. 60fps brings its own compromises, just like 30fps brings its own set of compromises. It's all about which compromises you're willing to make and which you're not. The Evil Within being 60 frames doesn't refute that Mikami may want his next game to be 30 frames, if 30 frames allows him to meet his goals for what he wants that game to be.

I believe you're coming at this argument as "all things equal, 60 frames is superior," and yes, I would agree with you, but all things aren't equal. Sacrifices have to be made to achieve 60 frames, and sacrifices also have to be made if you're shooting for 30. It's not a "one size fits all" argument and there is no definitive answer. It depends on what game you're making for the specific hardware. TLOU being 60 frames on PS4 doesn't negate that TLOU is a better game on PS3 hardware when designed around 30 frames.
 
"Hey guyz, let's just never change anything"-the-post.

To be fair to him in one regard, I do understand the argument that framerate CAN be a choice, just as we see people purposefully using shitty video quality in films, or black and white, or what have you... to evoke a sense of time, or to produce the feel of a certain type of entertainment (Rocky Balboa, for example, switches from film to video for its boxing match finale)...

But most everything moves on, because the vast majority of black and white films weren't saying anything by being black and white and just wanted to evoke reality as much as possible.

The same is true of framerate. Right now we're at the point in time where shooting and projecting high frame rate is expensive, and your choices for going to see it are limited. I know there's a group of people that hate 3D, but it's standard now for certain types of releases, and audiences are holding steady. The expectation is there now that next year's superhero movies will be in 3D, but it took decades of sporadic 3D hits and failures before it reached the point where audiences expect it.

But in time, audiences are going to expect visual fidelity and framerate to get better. And as they get used to it, their minimum standards are going to change.

Just the same way we'd shit on a game made for PS4 that looked and ran like Resident Evil 4 today. And we'd be right to do it. If the latest big budget movie came out and it randomly looked like Technicolor we'd all criticize it.
 
People use The Hobbit as an example of higher frame rates being a bad thing, NOT because of the higher frame rate itself, but because it makes it easier to notice things that go unnoticed at lower frame rates. You can notice the make-up, the sets and everything, and it breaks the illusion. Again: NOT because higher frame rates are a bad thing. They are not; they give you more information, otherwise you wouldn't be able to notice these things.

What should happen in this case is that studios and directors should adapt, or change the way they make these things or shoot their films, so you can't notice them at higher frame rates, to keep the illusion going. Then people would be able to appreciate higher frame rates, because they make scenes and everything going on on the screen so much clearer, especially in fast action scenes or when there is fast camera movement.
 
On limited hardware, yes, yes there are plenty who deliberately shoot for 30 frames. The Evil Within is 60 frames because the developers chose that 60 frames best met their goals for what they wanted the game to be. Consoles are fixed hardware, which means you will always have to make compromises and decisions that you feel best suit your game. 60fps brings its own compromises, just like 30fps brings its own set of compromises. It's all about which compromises you're willing to make and which you're not. The Evil Within being 60 frames doesn't refute that Mikami may want his next game to be 30 frames, if 30 frames allows him to meet his goals for what he wants that game to be.

I believe you're coming at this argument as "all things equal, 60 frames is superior," and yes, I would agree with you, but all things aren't equal. Sacrifices have to be made to achieve 60 frames, and sacrifices also have to be made if you're shooting for 30. It's not a "one size fits all" argument and there is no definitive answer. It depends on what game you're making for the specific hardware. TLOU being 60 frames on PS4 doesn't negate that TLOU is a better game on PS3 hardware when designed around 30 frames.
Reading comprehension please. Developers choose 30 fps for the sake of other things. No developer is choosing 30 fps because they want 30 fps. They're choosing it because they don't want to concede other things.

Developers that choose 60, do so because they want 60.
 
To be fair to him in one regard, I do understand the argument that framerate CAN be a choice, just as we see people purposefully using shitty video quality in films, or black and white, or what have you... to evoke a sense of time, or to produce the feel of a certain type of entertainment (Rocky Balboa, for example, switches from film to video for its boxing match finale)...

But most everything moves on, because the vast majority of black and white films weren't saying anything by being black and white and just wanted to evoke reality as much as possible.

The same is true of framerate. Right now we're at the point in time where shooting and projecting high frame rate is expensive, and your choices for going to see it are limited. I know there's a group of people that hate 3D, but it's standard now for certain types of releases, and audiences are holding steady. The expectation is there now that next year's superhero movies will be in 3D, but it took decades of sporadic 3D hits and failures before it reached the point where audiences expect it.

But in time, audiences are going to expect visual fidelity and framerate to get better. And as they get used to it, their minimum standards are going to change.

Just the same way we'd shit on a game made for PS4 that looked and ran like Resident Evil 4 today. And we'd be right to do it. If the latest big budget movie came out and it randomly looked like Technicolor we'd all criticize it.

Funny you mention 3D movies. They are horribly blurry at low framerates, almost unwatchable.

It's time for change, and the fact that we are frequently discussing this matter here at GAF makes me want to believe that at least "something" is moving forward. Hopefully we won't see any triple-A games with shitty framerates anymore. Think about GTA V and TLOU, arguably the biggest/most important titles last gen, and both had a super shitty, almost 24fps-like framerate. Seriously, where is that supposed to lead? Where is the "stop"? Every dev says "30FPS" but they are consistently going below that just for better graphics and people STILL buy that. The worst part of it: I bought TLOU on Day 1 as an ND "fan" and a fan of the setting and I DID NOT KNOW that the framerate would be THAT shitty even though I READ a ton of reviews!!! This industry is somewhat fucked with its priorities, and if it's losing steam then it's for the right reasons, which is consistently putting VIDEO over GAME. That's not how it works, I believe. Temporarily, for a certain crowd, maybe. But it's not a valuable long-term strategy to weaken your own strong points, which is interaction/being interactive.
 
People use The Hobbit as an example of higher frame rates being a bad thing, NOT because of the higher frame rate itself, but because it makes it easier to notice things that go unnoticed at lower frame rates. You can notice the make-up, the sets and everything, and it breaks the illusion. Again: NOT because higher frame rates are a bad thing. They are not; they give you more information, otherwise you wouldn't be able to notice these things.

What should happen in this case is that studios and directors should adapt, or change the way they make these things or shoot their films, so you can't notice them at higher frame rates, to keep the illusion going. Then people would be able to appreciate higher frame rates, because they make scenes and everything going on on the screen so much clearer, especially in fast action scenes or when there is fast camera movement.

It's not universally true of everything in the film either, though. The CG looks more believable at high frame rates to me. During the Riddles in the Dark sequence, Gollum is far more convincing at 48 fps. Furthermore, everything that's real looks more real. It's not just concessions.

It's a learning experience. I think it was unfair to expect the first big theatrical HFR film to get everything right.

And both of The Hobbit movies were still cinematic when exhibited at HFR. You could argue that they were less cinematic, perhaps, but I've never heard anyone saying they weren't cinematic. That certain things looked more obviously fake? Sure, but uncinematic? Crazy.
 
I'm really starting to tire of both extreme sides to this well-trodden debate. 30fps looks more filmic, period. If that is something that the developers feel is important to capture, then using 30fps can be a stylistic choice. It also allows them to pump out more detail and effects given the extra rendering headroom.

60fps is objectively better with regard to motion clarity and responsiveness, there's no denying that. But it's not like 30fps is an unplayable mess if it's locked and stable.

Extreme opinions on this tired subject can go fly a kite. Both framerates have their stylistic advantages, and the developers are free to choose whichever they'd like to convey the look they're going for.

Now, sub-30fps drops are another story altogether. That just shouldn't happen.
 
Yes, they shoot for 30fps, but it's always for a reason; not because "30fps is more fitting for this game because it is."
At best it's not the only reason. Even with way, way more powerful hardware, I suspect Ready at Dawn or Crytek would still go "ok, so what can we pull off while still being 30 fps, roughly at least?" If you could somehow create hardware that magically had no limits, or more realistically had limits that could not reasonably be reached, they might still force 30 fps; but if they had sense they would make it an option, so that "the feel" could be there for those who want it while those who DON'T can have the game unrestrained.
 
To me, the higher the framerate the better. I can notice tearing and dips when it goes below 60fps or even 30fps on consoles. It's not only annoying, it's a real displeasure when you're playing a game and tearing/stuttering come into play.
 
Funny you mention 3D movies. They are horribly blurry at low framerates, almost unwatchable.

It's time for change, and the fact that we are frequently discussing this matter here at GAF makes me want to believe that at least "something" is moving forward. Hopefully we won't see any triple-A games with shitty framerates anymore. Think about GTA V and TLOU, arguably the biggest/most important titles last gen, and both had a super shitty, almost 24fps-like framerate. Seriously, where is that supposed to lead? Where is the "stop"? Every dev says "30FPS" but they are consistently going below that just for better graphics and people STILL buy that. The worst part of it: I bought TLOU on Day 1 as an ND "fan" and a fan of the setting and I DID NOT KNOW that the framerate would be THAT shitty even though I READ a ton of reviews!!! This industry is somewhat fucked with its priorities, and if it's losing steam then it's for the right reasons, which is consistently putting VIDEO over GAME. That's not how it works, I believe. Temporarily, for a certain crowd, maybe. But it's not a valuable long-term strategy to weaken your own strong points, which is interaction/being interactive.

I'm well aware. I haven't yet bought GTA5 or The Last of Us because I struggle to enjoy games that run at those framerates. People try to make out like it's because I care about graphics over gameplay, when it's actually totally the opposite.

I'm eagerly awaiting Last of Us PS4 and I'm very optimistic for a GTA5 on PS4 or PC so I can play those games without too many concessions made towards graphics over gameplay.
 
It's not universally true of everything in the film either, though. The CG looks more believable at high frame rates to me. During the Riddles in the Dark sequence, Gollum is far more convincing at 48 fps. Furthermore, everything that's real looks more real. It's not just concessions.

It's a learning experience. I think it was unfair to expect the first big theatrical HFR film to get everything right.

And both of The Hobbit movies were still cinematic when exhibited at HFR. You could argue that they were less cinematic, perhaps, but I've never heard anyone saying they weren't cinematic. That certain things looked more obviously fake? Sure, but uncinematic? Crazy.

Absolutely true. Forgot to mention, but yes, during the CG it is incredibly better, and even more believable. Animated films should all be released in 48 fps from now on. It makes a lot of difference.
 
I'm really starting to tire of both extreme sides to this well-trodden debate. 30fps looks more filmic, period. If that is something that the developers feel is important to capture, then using 30fps can be a stylistic choice. It also allows them to pump out more detail and effects given the extra rendering headroom.

60fps is objectively better with regard to motion clarity and responsiveness, there's no denying that. But it's not like 30fps is an unplayable mess if it's locked and stable.

Extreme opinions on this tired subject can go fly a kite. Both framerates have their stylistic advantages, and the developers are free to choose whichever they'd like to convey the look they're going for.

Now, sub-30fps drops are another story altogether. That just shouldn't happen.

Look, if we're going with the notion that 24 fps looks filmic *because films*, which I personally think is obviously absurd, then I don't think we can go with the whole "30 fps is *more* filmic" thing.

Are sitcoms and soap operas more cinematic or filmic than Tomb Raider PS4? Is Tomb Raider PS4 less cinematic than Tomb Raider Xbox One?

I mean, if you want to make that argument, go ahead... but I'd say it's absurd.
 
I do not like playing 60fps games. 60 FPS simply looks artificial, it breaks immersion, and controlling characters imparts a sense of jerkiness, spazocity, and a general feeling that I'm controlling a tweaker. Why? Not sure. Probably because I'm so used to 24fps movies.
 
Look, if we're going with the notion that 24 fps looks filmic *because films*, which I personally think is obviously absurd,
What strikes me about this argument is that even conceding "24-to-30fps look more filmic", I'm genuinely not sure why "looking more filmic" is something we should praise and embrace.
Fuck filmic, then?
 
I do not like playing 60fps games. 60 FPS simply looks artificial, it breaks immersion, and controlling characters imparts a sense of jerkiness, spazocity, and a general feeling that I'm controlling a tweaker. Why? Not sure. Probably because I'm so used to 24fps movies.

I think you are being sarcastic but: Don't worry, you got a lot of brothers here on GAF.
 
Reading comprehension please. Developers choose 30 fps for the sake of other things. No developer is choosing 30 fps because they want 30 fps. They're choosing it because they don't want to concede other things.

Developers that choose 60, do so because they want 60.

Again, if all things were equal I would agree with you, but if a developer feels their game is better at 30 frames, then don't they want it at 30 frames? By your logic no developer chooses 60 frames either; if they could just have 120, they're choosing 60 frames to concede to other things.

Yes, they shoot for 30fps, but it's always for a reason; not because "30fps is more fitting for this game because it is."

On specific hardware that's absolutely the case. TLOU would be a fundamentally different game were it 60 frames on PS3. TLOU at 60 frames on PS4 is still fundamentally the TLOU we know and love, just at 60 frames thanks to the improved hardware. Just like TLOU 2 would be a fundamentally different game were it designed around 60 frames as opposed to 30 on PS4 hardware. Just because a developer has reasons for making their game 30 frames doesn't mean that a game being 30 frames isn't a deliberate choice made by the developer for what they see as the better game given the hardware they are working on.

I'm still unsure what you guys' argument is. If it's "60 fps is not less cinematic than 30 frames" then I agree. Given unlimited hardware I'd choose 60 frames 100% of the time. But saying "no developer wants their game to be 30 frames on select hardware" completely neglects the realities of game development.
 
Absolutely true. Forgot to mention, but yes, during the CG it is incredibly better, and even more believable. Animated films should all be released in 48 fps from now on. It makes a lot of difference.
It really does. Makes 24 look really bad in comparison. Even more so than live action stuff. The exaggerations in animation and whatnot are given such a boost.
 
I just wanted to chime in to say that "filmic" is the single dumbest thing I've ever read that tried to rationalize low-tech, poor-performance graphics.
 
I do not like playing 60fps games. 60 FPS simply looks artificial, it breaks immersion, and controlling characters imparts a sense of jerkiness, spazocity, and a general feeling that I'm controlling a tweaker. Why? Not sure. Probably because I'm so used to 24fps movies.

Oh no, I'm serious. Feels like I'm controlling a monkey who just broke out of the mountain dew nuka-cola concentrate vault.
 
Throwing in my opinion.

30fps is fine. If you have a locked 30 it is a dream to play, as long as it is fine-tuned for that framerate. The Last of Us was usually around 20fps, which IS a bit jarring. But I absolutely think making it 60fps would take away from its feel, because 60 DOES make things appear faster, unless compensated for and slowed down.

Most games benefit from 60fps, but there are exceptions. I'd much rather developers target higher res and a locked 30 than strip things down to hit 60.
 
Many casual gamers never finish games, so it's not a surprise some prefer 30 FPS. It's just something to stare at for a bit before moving on to the next set of shinier screenshots.
 
What??? The whole reason it is 30 FPS is that it's sacrificing 60 FPS for higher graphical fidelity.

Some people...

Some people...don't comprehend edits.
My question was: could The Order still do what it intends to do if it was at 60 fps and not at 30? Not sure where you got the sacrificing part, considering that's your assumption for why they lowered it; they never revealed why, other than that they wanted to keep it "filmic".
 
Actually, I'd prefer we stop with the "can we stop with the whole '60fps is not cinematic'" lines. OP's is especially embarrassing. Do you even KNOW what cinematic means? It certainly doesn't mean "more life-like". It means cinema-like, meaning film-like, meaning lower framerate w/motion blur. 60fps is NOT cinematic because its motion looks even less like cinema than 30fps does. What you're saying is you don't WANT games to look more cinematic; you want them to look more life-like. And that is a whole other matter entirely.



Are you seriously trying to ask this to catch somebody? Why on earth would the developer not take advantage of the extra resources it frees up, regardless of whether they chose 30fps for its more cinematic look?

I'm not sure you understand my question. It basically boiled down to: could The Order still push for its filmic look at 60 fps with the same amount of graphics, or does 60 fps automatically cripple it from looking "cinematic" or "filmic"? Instead of debating through sarcasm, try reading the post and debating what was written.
 
I'm not sure you understand my question. It basically boiled down to: could The Order still push for its filmic look at 60 fps with the same amount of graphics, or does 60 fps automatically cripple it from looking "cinematic" or "filmic"? Instead of debating through sarcasm, try reading the post and debating what was written.

Well, here is your answer: No, it does not.
 
Of course it isn't, that's why cutscenes are rendered at 30 fps. How is this even a discussion? You silly PC elitists are silly


*goes back to playing witcher 2 at locked 60 fps*
 
Honestly, I just took the whole "filmic" comment as a PR statement, and there was nothing wrong with that. It's folly to expect devs to talk smack about console hardware. They went for 30 fps for better graphical effects, and in this way it is more "filmic". Prettier visuals are more marketable via screenshots, and at the end of the day they are looking to make as much money as they can with a single-player game. It doesn't matter that I personally think FPS is king. Driveclub is also 30 fps for the same reasons.

30 fps itself is not "more filmic". I will never accept that reasoning.
 
Games are designed with pre-made animations, unlike films, which literally capture every nuance per frame. This is why 30fps can be so easily noticeable and "unfilmic": you can literally see, even in TO:1886, that animations try to blend to be as consistent as possible, and you can immediately tell whether the character is mo-capped or not. That kind of visual disconnect is so obvious I wonder how and why people bother to defend this.

Let's face it, the more frames the better. If the game doesn't look that good at 60fps, it's because the animators simply didn't take advantage of the extra frames. It's one of the main reasons why RAGE's enemy AI can look so fluid compared to the "forced" 60fps that the PC gets in 30fps console ports.
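
As a toy illustration of that animation point (my own sketch, not anything from id's actual engine): a track authored with 30Hz keyframes only benefits from a 60fps display if the engine interpolates between keys (or the animators author more of them).

```python
# Hypothetical sketch: sampling a 30Hz-authored animation track at 60fps.
# Without interpolation, each key is just shown twice and nothing is gained.
def sample_pose(keys, t, key_rate=30.0, interpolate=True):
    """keys: poses authored at key_rate Hz (plain floats here for simplicity)."""
    x = t * key_rate                     # position in key space
    i = min(int(x), len(keys) - 2)       # index of the key at or before time t
    if not interpolate:
        return keys[i]                   # snap to the nearest earlier key
    frac = x - i
    return keys[i] * (1 - frac) + keys[i + 1] * frac  # lerp between keys

keys = [0.0, 1.0, 4.0, 9.0]              # toy 30Hz track
for frame in range(6):                   # render at 60fps
    t = frame / 60.0
    print(frame, sample_pose(keys, t), sample_pose(keys, t, interpolate=False))
```

The interpolated column changes every 60fps frame; the snapped column only changes every other frame, which is the "didn't take advantage of the extra frames" case.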
 
Throwing in my opinion.

30fps is fine. If you have a locked 30 it is a dream to play, as long as it is fine-tuned for that framerate. The Last of Us was usually around 20fps, which IS a bit jarring. But I absolutely think making it 60fps would take away from its feel, because 60 DOES make things appear faster, unless compensated for and slowed down.

Most games benefit from 60fps, but there are exceptions. I'd much rather developers target higher res and a locked 30 than strip things down to hit 60.
It doesn't. Your brain just isn't used to 60fps video games.
 
I'm not sure you understand my question. It basically boiled down to: could The Order still push for its filmic look at 60 fps with the same amount of graphics, or does 60 fps automatically cripple it from looking "cinematic" or "filmic"? Instead of debating through sarcasm, try reading the post and debating what was written.
I have to imagine in a scenario where it had the same exact graphical fidelity... that's really just down to personal preference. It may not be what THEY consider "filmic" but those who don't think that undermines films feeling like films with Hobbit HFR or motion interpolation on TVs would probably still feel it looks filmic.
 
Why is this an argument? You can crank all the settings to ultra and still get 60fps when playing on PC; having worse performance in the game does not make it "more cinematic"
what? :/
Hold on while I change all the settings to ultra on my PS4 or Xbox One. Oh wait. No, I can't.

On consoles there is a trade-off, and believe it or not most PC gamers have that same trade-off too. Not everyone can run everything on ultra at 60fps; that's why PC games let you scale your own settings, so you can choose between graphics, resolution and performance.

I never said having worse performance makes the game more cinematic, but usually cinematic games tend to go for graphics and effects over having the game run at 60fps. It's more engrossing for the story/game to have it look as good as it can.
 
Worst post today? I think so.

60fps will always be better than 30fps. Is a fluctuating 60ish better than a locked 30? That's more complicated


Well, it is not complicated. I'd take a 45 to 60 fps game over a locked 30 any day, especially when the game on average runs at 50 to 55.
 
The reason why we would rather have 30fps than 24fps is because 24 does not divide evenly into 60 so you'd get a lot of judder. However, at 30fps, you will see each frame exactly twice resulting in smooth motion.
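
You can see the cadence with a quick back-of-the-envelope check (my own arithmetic, assuming a plain 60Hz display with no pulldown tricks):

```python
# Count how many 60Hz display refreshes each source frame occupies.
def refresh_pattern(source_fps, display_hz=60, frames=8):
    counts, shown = [], 0
    for f in range(1, frames + 1):
        total = int(f * display_hz / source_fps)  # refreshes elapsed so far
        counts.append(total - shown)
        shown = total
    return counts

print(refresh_pattern(24))  # [2, 3, 2, 3, ...] -> uneven 3:2 cadence (judder)
print(refresh_pattern(30))  # [2, 2, 2, 2, ...] -> every frame shown exactly twice
```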

That being said... I have no problem with 30fps games or 60fps games, but they must be CONSISTENTLY 30 or CONSISTENTLY 60. I hate it when the frame rate is all over the place.

Remember the Dead Space games? Those ran at 30fps and almost never deviated from that refresh... that made them very easy to play. The controls were always consistent and since the frame rate wasn't bouncing around everywhere, you hardly noticed it was only 30.

Watch Dogs is another example of a 30fps title that hardly ever changes. It's so ridiculously smooth that it's a non-issue.

However, when I played Tomb Raider DE, depending on the part I was on, the frame rate could be 30, 45 or 60. It was inconsistent and it just felt messy.

Also, I do think 30fps looks closer to film than 60 does. Give me a constant 30fps with outstanding graphics and lighting with good motion blur and the game will play perfectly.
 
I do not like playing 60fps games. 60 FPS simply looks artificial, it breaks immersion and controlling characters imparts a sense of jerkiness, spazocity, and a general feeling that I'm controlling a tweaker. Why? Not sure. Probably because I;m so use to 24fps movies.

When playing a 60fps game (no matter how good the graphics are) it reminds me of playing an arcade game. It's a constant reminder that I'm playing a video game and breaks the immersion. Maybe that's what you're feeling.

I also hate that motion flow crap they put in TVs these days. Makes every show look like it was filmed on a consumer camcorder.

There are 4 kinds of people:

1. Those who won't play anything less than 60fps (PC snobs)
2. Those who love 30fps for its consistency and cinematic look
3. Those who love an unlocked frame rate, even if it means it's only 60fps when looking up at the sky.
4. Those who prefer a stable 60fps, but would rather take a capped 30fps over an unstable 30-60 frame rate.
 