Can we please stop with the whole "60 fps is not cinematic" argument?

I think you are both arguing different points. :-P

You are correctly saying that if you filmed something at an absurdly high framerate, paused the film, and looked at a single frame, that image would have NO motion blur.

Then the argument being correctly made is that if that film is played back and there is a fast-moving object in it, the blur your eyes perceive would be identical to watching the same scene directly in real life.

You're both right but talking about completely different things!
The reason we're having this argument is that Metroid-Squadron is actually disagreeing with the bolded, evidently believing that an absurdly high-framerate video with no built-in motion blur would be distinguishable from a continuous image.
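To see the distinction concretely, here's a toy sketch (hypothetical C++ with made-up names; a 1D "scene" of one bright moving pixel, not anyone's actual camera or engine pipeline). A single instantaneous sample is perfectly sharp, while averaging sub-samples across a 24fps camera's 180-degree shutter smears the object into a streak — the same accumulation trick renderers commonly use to approximate motion blur:

```cpp
#include <cstdio>
#include <vector>

// Toy 1D "scene": one bright pixel moving at `speed` pixels per second,
// sampled instantaneously at time t.
std::vector<double> render(double t, int width, double speed) {
    std::vector<double> img(width, 0.0);
    int x = static_cast<int>(speed * t);
    if (x >= 0 && x < width) img[x] = 1.0;
    return img;
}

int main() {
    const int width = 16;
    const double speed = 240.0;      // fast-moving object: 240 px/s
    const double shutter = 1.0 / 48; // 180-degree shutter on a 24fps camera

    // The "paused frame" case: one instantaneous sample, no blur at all.
    std::vector<double> sharp = render(0.0, width, speed);

    // The "film camera" case: average 32 sub-samples taken while the
    // shutter is open; the object smears across ~5 pixels.
    const int samples = 32;
    std::vector<double> blurred(width, 0.0);
    for (int i = 0; i < samples; ++i) {
        std::vector<double> sub = render(shutter * i / samples, width, speed);
        for (int p = 0; p < width; ++p) blurred[p] += sub[p] / samples;
    }

    auto show = [](const std::vector<double>& img, const char* label) {
        for (double v : img) std::printf("%4.2f ", v);
        std::printf("<- %s\n", label);
    };
    show(sharp, "single high-shutter-speed sample: sharp");
    show(blurred, "integrated across the shutter: blur streak");
}
```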
 
There's literally nothing that I'd like to see at sub-60 fps. My media player encodes even my movies to 72 fps, and now I have a hard time going to a theater to watch an action movie.
 
Then the argument being correctly made is that if that film is played back and there is a fast-moving object in it, the blur your eyes perceive would be identical to watching the same scene directly in real life.

You're both right but talking about completely different things!
I get his point. The problem is that there's absolutely no evidence of that hypothetical effect.
 
You people are the reason no theaters around me were showing the Hobbit in 48fps! Embrace the future, you luddites!
No, the reason theaters weren't showing it was because it looks stupid in 48fps. The reason we want games at 60fps is for the feel, but a movie is just a watching experience and you lose that movie magic when you get rid of 24fps. More frames don't mean better things for every medium.

On topic though: the whole cinematic argument for games is stupid because games are not supposed to be cinematic. It's an entirely different medium.
 
The only problem I can foresee with 60fps is running games that weren't designed for it.

If you have a game that relies on animation speed, for example, changing that speed can look really weird to some people.
If the game is locked to 30fps and you double that framerate, everything looks like it's on fast-forward.
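To make that fast-forward effect concrete, here's a minimal sketch (hypothetical code, not from any particular engine) contrasting a frame-locked update with a delta-time update:

```cpp
#include <cstdio>

// Frame-locked update: a fixed distance per *frame*, so game speed is
// tied to however many frames are rendered per second.
void update_frame_locked(double& pos) {
    pos += 5.0;
}

// Delta-time update: a fixed distance per *second*, so game speed is
// the same at 30, 60, or 144 frames per second.
void update_delta_time(double& pos, double dt_seconds) {
    pos += 150.0 * dt_seconds;
}

int main() {
    // Simulate one real-world second of gameplay at 30fps and at 60fps.
    for (int fps : {30, 60}) {
        double locked = 0.0, scaled = 0.0;
        for (int frame = 0; frame < fps; ++frame) {
            update_frame_locked(locked);
            update_delta_time(scaled, 1.0 / fps);
        }
        // At 60fps the frame-locked object covers twice the distance in
        // the same second: the "fast-forward" effect described above.
        std::printf("%dfps: frame-locked = %.0f units, delta-time = %.0f units\n",
                    fps, locked, scaled);
    }
}
```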


What people don't understand, though, is that the same game, if designed for 60fps, would look leagues better than at 30fps.



A faster framerate, if utilized properly, literally cannot ever be a bad thing.
 
No, the reason theaters weren't showing it was because it looks stupid in 48fps. The reason we want games at 60fps is for the feel, but a movie is just a watching experience and you lose that movie magic when you get rid of 24fps. More frames don't mean better things for every medium.

On topic though: the whole cinematic argument for games is stupid because games are not supposed to be cinematic. It's an entirely different medium.
Agreed. The high framerate version of The Hobbit looked like total garbage to me, but I love 60 FPS in games. Just feels right.
 
I wish people would stop holding on to low framerates being cinematic in general. 24fps shouldn't be acceptable for... anything.

You people are the reason no theaters around me were showing the Hobbit in 48fps! Embrace the future, you luddites!
No thanks, don't want my movies to look like a soap opera.
 
I wish people would stop holding on to low framerates being cinematic in general. 24fps shouldn't be acceptable for... anything.

You people are the reason no theaters around me were showing the Hobbit in 48fps! Embrace the future, you luddites!

As a PC gamer and firm believer in the 60fps truth, I hated the hobbit in 48fps.
 
In any event, there's nothing inherently bad about a movie running at a higher framerate, either.
I would say the makeup, set design, stuntwork, and other practical illusion techniques do not translate to HFR as they are. So yeah, movies can be fine at 48 or 60 frames per second, but outside of full CG stuff it will require rethinking many traditional filmmaking techniques.

Games do not have this problem though. Everything is rendered. No excuse to not be 60fps unless you are hardware limited.
 
I can't wait for the day when these conversations are no longer relevant. Some day everyone will realize that native resolution and frame rate of your screen are more important than extra graphical effects.
 
The only problem I can foresee with 60fps is running games that weren't designed for it.

If you have a game that relies on animation speed, for example, changing that speed can look really weird to some people.
If the game is locked to 30fps and you double that framerate, everything looks like it's on fast-forward.


NFS: Rivals - Don't Unlock the 30FPS limit!! (Upd…: http://youtu.be/qpC43CdvjyA

Prime example
 
You can either have 30fps, which is pushing graphics and effects, or you can have 60fps, which won't look as good but will feel more responsive and look more fluid in motion.

As long as the frame rates are rock solid, both choices are fine; 30fps could probably be described as more cinematic because it's focused more on looks over the responsiveness of 60fps.

But unless 60fps is rock solid I'd rather have locked 30fps, because the variation in frame rate is worse than playing a game at 30fps in most cases.
 
Are there seriously people who think that framerate vs. visuals is NOT a conscious trade-off made by the developer?

There will never be hardware that can do 30fps and 60fps with equal image quality. The developers are not choosing the former out of spite, but because they would rather enhance the visuals a little bit more.
 
Do we really need more than 24 fps though?

This is 24 fps:

http://a.pomf.se/gtfttm.webm

You deliberately made the camera move slowly though, unless this is really a sophisticated joke about how game promotional materials hide the issue. They do the same thing in gameplay demonstrations, is what I'm saying.

Basically, as an extreme example: if nothing's moving on screen, you don't need more than 1fps. Or, like, a screenshot.
 
I can't wait for the day when these conversations are no longer relevant. Some day everyone will realize that native resolution and frame rate of your screen are more important than extra graphical effects.

No.

I'd rather have a 30fps game that looks like Infamous than a 60fps game that looks like Watchdogs. (I know Watchdogs isn't 60fps.)
 
Actually, I'd prefer we stop with the "can we stop with the whole '60fps is not cinematic'" lines. OP's is especially embarrassing. Do you even KNOW what cinematic means? It certainly doesn't mean "more life-like". It means cinema-like, meaning film-like, meaning lower framerate with motion blur. 60fps is NOT cinematic because its motion looks even less like cinema than 30fps does. What you're saying is that you don't WANT games to look more cinematic; you want them to look more life-like. And that is a whole other matter entirely.

With The Order now being capped at 30 fps for their "filmic" look, do fellow gaffers think this could be obtained at 60 fps?

Are you seriously trying to ask this to catch somebody? Why on earth would the developer not take advantage of the extra resources it frees up, regardless of whether they chose 30fps for its more cinematic look?
 
For me, games are games, cinema is cinema.

After getting a good PC for the first time in my life, playing some last-gen games at 60fps is a night and day difference!

I think it's just becoming an excuse now.
 
We really need to keep movies, TV shows, and such out of this 30fps vs 60fps topic; this thread is about games. While videogame graphics are improving year by year, there simply aren't any games that even come close to real photorealism, so the argument that 60fps looks cheap for a videogame is silly, and anyone sharing that opinion is simply being dishonest. Computer-generated graphics benefit from having a higher frame rate, and I don't see how people don't get that.
 
First post nails it.

Is this a new joke? I've only recently begun to see this :)

But hell yeah, frame rate trumps resolution any time of the day for me. I don't want more similarly colored pixels at a point in time if it means less frequent differently colored pixels (pixels depicting motion by changing their colour and intensity over a period of time). That's some deep philosophical shit right there.
 
Early film usually ran at 16-24 fps (where 16 fps was the norm but film was often over-/undercranked for various reasons). If you've ever seen an old clip where the actors look sped up, it was probably shot at less than 24 fps but played back at that rate.

24 fps became standard when sound film was introduced, and as far as I know it was chosen because it was the slowest (cheapest) speed with acceptable sound quality.

There's nothing magical about 24 fps, it's just what we're used to for historical reasons.
 
Early film usually ran at 16-24 fps (where 16 fps was the norm but film was often over-/undercranked for various reasons). If you've ever seen an old clip where the actors look sped up, it was probably shot at less than 24 fps but played back at that rate.

24 fps became standard when sound film was introduced, and as far as I know it was chosen because it was the slowest (cheapest) speed with acceptable sound quality.

There's nothing magical about 24 fps, it's just what we're used to for historical reasons.
It also looks better than other framerates.
 
Anyone who has played competitive games can tell you that 60fps isn't there for the graphics, it is there for the gameplay.

Try playing a MOBA or a fighting game at 30 vs 60; the difference is night and day.
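For a rough sense of why that is, here's a back-of-the-envelope sketch (an assumed, simplified pipeline model for illustration, not measured data; real latency varies by engine and display): input is sampled at the start of a frame, then takes one frame of simulation, one of rendering, and one of display.

```cpp
#include <cstdio>

// Assumed model: ~3 frame times from button press to photons, plus up
// to one extra frame of waiting if the press lands just after the
// input-sampling point. Illustrative only, not measured data.
int main() {
    for (int fps : {30, 60, 120}) {
        double frame_ms = 1000.0 / fps;
        double best = 3 * frame_ms;  // pressed exactly at the sample point
        double worst = 4 * frame_ms; // pressed just after the sample point
        std::printf("%3dfps: frame = %5.1f ms, input-to-display ~%5.1f-%5.1f ms\n",
                    fps, frame_ms, best, worst);
    }
}
```

Under that model, 30fps sits around 100-133 ms of input-to-display latency versus roughly 50-67 ms at 60fps, which is why the gap is so obvious in timing-sensitive genres.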
 
I like 60fps. But on fixed hardware, choices are made. It is funny that people saying that 30fps is enough get such backlash. The 60 fps army is whining and complaining way more than anything else. It is getting annoying. Let game makers, certainly on fixed hardware, make the game they want to make, and judge the thing as a whole.
 
Watching != interacting. Passively observing a linear visual sequence has you perceive framerate pretty differently from interacting with a non-linear one. The former requires little investment, and allows your brain to relax and comfortably find patterns in "24 frames per second" data to form a cohesive visual sequence. The issue interactive mediums introduce is the fact that we're no longer casual observers; we have a natural expectation of patterns not just in visuals but in visuals as feedback for interactivity. We don't "run" at a low framerate, or any framerate, so interactivity paired with low-framerate feedback can lead to a weird dissonance. In almost all scenarios your brain can and will appreciate more frequently updated visual feedback to your physical input, as that is what we're used to simply via our own existence.

An easy experiment is to play a game at 60fps and at 30fps while recording footage, then go back and watch that footage. In the former scenario the footage will almost always seem oddly smooth and fast, more so than you remember it being while playing. Playing the latter may lead to frustration as you try to line up shots and move your character, yet the footage will seem perfectly cohesive and smooth.

Watching is a passive experience: your brain only needs to find patterns in the visual data and that's it. If it works, it works, and everybody is happy. Interaction is different: your brain is not passively finding visual patterns but attempting to correlate those patterns with your own deliberate physical input, which is not bound by "framerates".

An argument that gameplay should be 60+fps and cutscenes 30fps is another matter entirely.

But I never had any problems with 30 FPS; in fact, I used to game on a shitty PC where the game ran at 20 FPS with no problem. What I am trying to say is that 60 FPS in games is really not that critical (especially in some genres); some people here make it sound like 30 FPS is beyond unplayable at times. I played MGSV on the PS4 extensively for a week, and not even a day later I started playing Infamous: SS; while MGSV felt smoother, it's not a night and day difference, especially control-wise.
 
Anything "cinematic" needs to disappear. Almost every movie today is filmed digitally, using a few options in aspect ratio. Film grain and other affectations are used because some people want to create a "cinematic" experience while using digital. How idiotic does that sound? Frame-rates remain low because that's what we're used to, but they don't have to be like that any more.

Games are not filmed, so there's no inherent need to lock frames to simulate film. Let it go.

throw #filmic to the #trash
 
You can either have 30fps, which is pushing graphics and effects, or you can have 60fps, which won't look as good but will feel more responsive and look more fluid in motion.

As long as the frame rates are rock solid, both choices are fine; 30fps could probably be described as more cinematic because it's focused more on looks over the responsiveness of 60fps.

But unless 60fps is rock solid I'd rather have locked 30fps, because the variation in frame rate is worse than playing a game at 30fps in most cases.

Why is this an argument? You can crank all the settings to ultra and still get 60fps when playing on PC; having worse performance in the game does not make it "more cinematic".
 
Well, I don't know about cinematic, but honestly it's weird playing third-person games at 60fps (like TR:DE); for some reason it just feels wrong IMO. But racing games really need 60+, more like 180fps.

But can we stop with the "it's 60 or nothing" attitude then?
 
60fps exposes a lot of games for their bad animations and interactions as well. Not all games of course, but a lot of them just have shit looking movements that look even worse at higher framerates. Games that get it right have such a cleaner, more refined look. It's 60fps all the way for me, even from a visual standpoint. Nothing is cinematic about a low framerate by itself, so assuming you can just cap it and create "dat cinema" is a really crazy idea for me. What's actually happening on screen during gameplay and how well it runs is a much better target. Certain games are ok at low rates like survival horror, but even those can be immersive at higher ones. Something about busted, low framerate movement makes them more scary for me though most of the time.
 
People should just say "We went with a lower framerate because we wanted higher image quality, because we care more about how it looks than how it plays."

It's one thing to have a different priority; what I can't stand is the half-assed bullshit justifications. Just admit the truth. It'll set you free!

Alternate acceptable truth: "We didn't want to spend a lot of money making our animation look good"
 
120 fps or bust you peasants.
This. After going 120 (I could do 144, but I LOVE LightBoost so much, and it's the next best thing after a G-Sync upgrade, which costs way too much), I can't go back... OK, I can play it, but it is maddening to know people think it's better to do 30 than 60 FPS, or even say 120 is not needed. IT IS. It's just that we have not seen how SMOOTH it makes gameplay, and how enjoyable it is when you have full control with no loss of input.
 
The only game I've ever preferred at 30FPS was Dark Souls, because the alternative was basically 15. Filmic is a weird term. How far back does filmic go? To silent film? Should it be sepia? Should there be a shutter noise over the game? Nothing but a retro style should go for these things. The only reason we like low FPS is because we're used to it.
 
Did you see The Hobbit in 48fps? It was fucking horrible.

I understand the reasoning when people say it is not cinematic, so I really don't judge them like I'm a dictator.
 
Can people stop trying to pass their opinion off as fact? That would solve this argument. You might prefer 60fps, but if the game is 30fps, that's just too bad for you. You have the option of either accepting the game the way it is or disregarding it and refusing to play it. Either is fine, but please don't whine about it to everyone; it is only going to piss people off. At that point of development, your whining will accomplish nothing. Nobody is doubling the framerate in the final stage of development, and the devs won't delay the game to get it to 60fps. Killzone tried to do this with the multiplayer and it flopped on its face. Seriously pointless. As long as the framerate is locked, there should be no problem. There is consistency in input latency, and humans are capable of adapting.
 
60 FPS will go back to being a very good, very wanted thing if exclusive games start targeting it. We know the real reason it's suddenly a bad thing.
The manner in which the studio handled not being able to hit 60FPS is secretly fucking brilliant. Developers that can't hit 1080p on the PS4 later on in the generation due to hardware constraints should say that their games are trying to strike a "retro-gaming feel", so we can actually get people who try to argue that resolution preference is subjective as well. Some multiplatforms could try it right now!
 