Can we please stop with the whole "60 fps is not cinematic" argument.

HFR doesn't preclude motion blur, and as a matter of fact, the Hobbit films still exhibit motion blur in HFR, albeit less of it than the 24 fps version has.
Actually, the irony of all this "Hobbit HFR has extremely low motion blur" stuff is that The Hobbit was shot with a really long (1/64s) shutter, to split the difference with the 24fps version.

It's a faster shutter than 24fps films use, but seriously, the HFR version of the film has a ridiculous amount of motion blur for a 48fps video stream.
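For what it's worth, the arithmetic behind that "split the difference" claim is easy to check. This is just back-of-the-envelope math around the 1/64s figure quoted above, not any additional production detail:

```python
# Back-of-the-envelope shutter arithmetic for the figures quoted above.
# For a rotary shutter: exposure_time = (shutter_angle / 360) / fps

def exposure(fps, shutter_angle_deg):
    """Per-frame exposure time in seconds."""
    return (shutter_angle_deg / 360.0) / fps

def shutter_angle(fps, exposure_s):
    """Shutter angle implied by an exposure time at a given frame rate."""
    return exposure_s * fps * 360.0

print(exposure(24, 180))          # ~0.0208 s (1/48)  -- the conventional 24fps look
print(exposure(48, 180))          # ~0.0104 s (1/96)  -- a "normal" 180-degree shutter at 48fps
print(shutter_angle(48, 1 / 64))  # 270.0             -- the 1/64 s figure above, in degrees
```

So 1/64s at 48fps lands between the 1/48s a conventional shutter gives at 24fps and the 1/96s it would give at 48fps, which is exactly the compromise described above.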
 
I would disagree. In daily life, if something zooms past you, it's perceived as a blur. 48fps movies, MotionFlow TVs, etc., artificially remove that natural blur and force you to see moving images in a way you don't see them in reality. That's why it bothers people.

Slow motion, undercranking, high frame rate (like Saving Private Ryan's beach scene) are used to accentuate a moment, to make you take notice. Having it throughout the whole movie is distracting.

As far as games: more frames is better for an interactive experience. Games also have the benefit of being able to add motion blur in to help with making it seem "more real".
Natural motion blur is going to occur regardless. You're referencing artificial blur, which LCD monitors exaggerate.
 
Eh, not exactly. If you were to view a 10000000fps video stream that had no built-in motion blur, your eyes would apply their natural motion blur exactly as they would to a "real" scene.

There's nothing about HFR film that eliminates the manner in which your eyes time-average inputs.

Except that you are not watching a real object move. You are watching a series of images flash on a screen in slightly different positions that suggest movement. Any blur from movement is captured during filming, not on playback.
 
Except that you are not watching a real object move. You are watching a series of images flash on a screen in slightly different positions that suggest movement. Any blur from movement is captured during filming, not on playback.
Cameras are not eyes.
 
I don't particularly care for games to be cinematic in the first place, but I do care that, in Dark Souls, playing at 60fps made it much easier for me to use the i-frames of the roll against the 4 Kings, and that saved me several headaches.


If there is a game that is better at 30 than 60, I haven't seen it yet.

With that said, being a graphics whore, I can concede that some games don't NEED 60fps if hitting it means sacrificing other graphical settings.
But even then, 60's fluidity would be ideal, in a perfect world.
 
Except that you are not watching a real object move. You are watching a series of images flash on a screen in slightly different positions that suggest movement. Any blur from movement is captured during filming, not on playback.
The reason you don't get motion blur naturally in low-framerate video is that the objects jump too much from one frame to the next to create a natural blur trailing. Your visual system still has some degree of time-averaging to the way it absorbs and perceives the world on a screen, though (you might have noticed that quickly moving your cursor around on a desktop produces the appearance of "ghost" images behind the current cursor location).

In terms of how your eyes produce motion blur, there aren't any significant differences between watching a real object move and watching extremely closely-temporally-spaced frames simulate a real object moving.

Or do you think that motion blur is a result of real-world objects actually reflecting light in a blur pattern when there's relative motion? Because that's not how motion blur works at all.
 
I'm all for 60 fps in games, but a solid 30 does not bother me at all. I'm perfectly happy with either one.

The dancing girl from the WebM thread at 60 fps looked really really weird though. I didn't like it at all. However, I saw it at work on work grade LCDs, not at home on my plasma or good monitor.
 
If 60fps is so much better than 30fps, how come most TV shows are 30fps? It's certainly not due to technical reasons, since TVs and video capture equipment have been capable of 60fps for decades. It's not about cost either, since soaps have lower budgets and are 60fps.

Once again, a higher framerate does not impede creating a sense of realism. What needs to happen is that the production itself needs to better understand the challenges that the format presents and rise to meet them, not throw its hands up and just not bother with it. We've recently gotten 3D films where the 3D is a major and integral aspect of the film's visual language, like Gravity, and that's had just as bad a rap as HFR has, if not worse, because of how widespread rushed post-conversion jobs have been amongst major productions.

Assuming you do everything perfectly "right" as you say and it works on a film, it will not translate well to videogame graphics since they're still FAR away from photorealism, particularly in terms of character animation.

Too many ignorant people in the world to stop stupid arguments. It sounds harsh, and it is, but it is the cold hard truth.

60fps is in every way, shape, and form better than 30. People who make arguments defending 30, or even worse saying 30 is better, are sheep. Now, an argument can be made that some games don't "need" more than 30, like RPGs for instance. That is different than saying 30 is better than 60. RPGs might not need more than 30, that is true, but that doesn't mean that 60 isn't still better in every way.

Well, now that you've put it in a definitive and insulting tone I guess it's settled.

What do you mean by "real" and by "fake"? That's the whole problem here. One of the main complaints about Hobbit HFR is that it looks like a bunch of people walking around on a set. Does it look real? Absolutely, if by "real" you mean "like the real-world set being filmed." That's not the goal of the film, though; the film wants to make The Hobbit seem real.
That's the point. My brain doesn't buy the illusion they're presenting, due in big part to the higher framerate. The higher rate of information works against it.

A substantial fraction of people I talked about Hobbit HFR with, even if they were nothing resembling experts on anything video, thought something looked wrong with it even if they couldn't articulate what.
Glad to hear we're not alone.

The reason you don't get motion blur naturally in low-framerate video is that the objects jump too much from one frame to the next to create a natural blur trailing. Your visual system still has some degree of time-averaging to the way it absorbs and perceives the world on a screen, though (you might have noticed that moving your cursor around on a desktop quickly produces the appearance of "ghost" images behind the current cursor location).

In terms of how your eyes produce motion blur, there aren't any significant differences between watching a real object move and watching extremely closely-temporally-spaced frames simulate a real object moving.

Or do you think that motion blur is a result of real-world objects actually reflecting light in a blur pattern when there's relative motion? Because that's not how motion blur works at all.

That's because LCD monitors are high-persistence displays and therefore ghosting is perceived. Low-persistence displays like CRTs don't have that issue and they never add motion blur to an image.
 
The reason you don't get motion blur naturally in low-framerate video is that the objects jump too much from one frame to the next to create a natural blur trailing. Your visual system still has some degree of time-averaging to the way it absorbs and perceives the world on a screen, though (you might have noticed that quickly moving your cursor around on a desktop produces the appearance of "ghost" images behind the current cursor location).

In terms of how your eyes produce motion blur, there aren't any significant differences between watching a real object move and watching extremely closely-temporally-spaced frames simulate a real object moving.

Or do you think that motion blur is a result of real-world objects actually reflecting light in a blur pattern when there's relative motion? Because that's not how motion blur works at all.

I am aware that how cameras capture images and how our eyes perceive motion are different, if that's what you mean.
 
That's because LCD monitors are high-persistence displays and therefore ghosting is perceived. Low-persistence displays like CRTs don't have that issue and they never add motion blur to an image.
Non-lightboost LCDs have some smearing between the current and previous frame, but that's not single-handedly responsible for the tremendous perception of ghost cursors.

Dragging your cursor around at high speed on a CRT still produces an appearance of ghost cursors.

I am aware that how cameras capture images and how our eyes perceive motion are different, if that's what you mean.
Actually, my point is that they're strikingly similar. The human visual system behaves somewhat more continuously than cameras do, but both of them produce motion blur by imposing some degree of time-averaging on incoming imagery.

I said that the human eye looking at a non-blurred absurdly-high-framerate video stream would naturally apply motion blur to the result. A long-shutter low-framerate camera filming said hypothetical video stream would also capture a motion-blurred result.
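A toy way to see that point (this is a simulation sketch with made-up parameters, not a model of the human visual system or of any real camera): render a sharp one-pixel object at a very high frame rate, then average every source frame that falls inside one integration window. The averaged result is a smear, which is all motion blur is.

```python
import numpy as np

# Toy sketch: a sharp moving dot rendered at a very high frame rate, then
# time-averaged over a fixed integration window (standing in for either an
# eye's temporal integration or a long camera shutter). Parameters are made up.

WIDTH = 200          # width of a 1-D "image" in pixels
HIGH_FPS = 1000      # frame rate of the hypothetical sharp source
WINDOW_S = 1 / 48    # length of the integration window, in seconds
SPEED_PX_S = 2000    # object speed in pixels per second

def sharp_frame(t):
    frame = np.zeros(WIDTH)
    frame[int(t * SPEED_PX_S) % WIDTH] = 1.0   # a single-pixel "object"
    return frame

n = int(WINDOW_S * HIGH_FPS)                   # source frames per window (~20)
stack = np.stack([sharp_frame(i / HIGH_FPS) for i in range(n)])
blurred = stack.mean(axis=0)                   # time-average = blur trail

print("lit pixels in one sharp frame:", int((stack[0] > 0).sum()))   # 1
print("lit pixels after averaging:   ", int((blurred > 0).sum()))    # ~20
```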
 
If 60fps is so much better than 30fps, how come most TV shows are 30fps? It's certainly not due to technical reasons, since TVs and video capture equipment have been capable of 60fps for decades. It's not about cost either, since soaps have lower budgets and are 60fps.

I think you might be underestimating how cheaply a lot of TV productions try to run their operations, especially with how much more post-production work needs to be done these days than in the past. There's honestly not as much money going into these shows as you think there is, and you're not going to find larger budgets on anything other than premium channels like HBO, and even then, they try to mitigate costs as much as possible. They very much operate as close to the razor's edge as possible.
 
Non-lightboost LCDs have some smearing between the current and previous frame, but that's not single-handedly responsible for the tremendous perception of ghost cursors.

Dragging your cursor around at high speed on a CRT still produces an appearance of ghost cursors.
I'm doing it on my monitor right now and no, I see distinct still images of the cursor.

Actually, my point is that they're strikingly similar. The human visual system behaves somewhat more continuously than cameras do, but both of them produce motion blur by imposing some degree of time-averaging on incoming imagery.

I said that the human eye looking at a non-blurred absurdly-high-framerate video stream would naturally apply motion blur to the result. A long-shutter low-framerate camera filming said hypothetical video stream would also capture a motion-blurred result.
The human eye doesn't apply motion blur to a high-framerate video.

I think you might be underestimating how cheaply a lot of TV productions try to run their operations, especially with how much more post-production work needs to be done these days than in the past. There's honestly not as much money going into these shows as you think there is, and you're not going to find larger budgets on anything other than premium channels like HBO, and even then, they try to mitigate costs as much as possible. They very much operate as close to the razor's edge as possible.
As I said, soaps are cheaper to produce. Why don't they switch to 30fps?
 
As I said, soaps are cheaper to produce. Why don't they switch to 30fps?

They shot on tape, which doesn't offer a lot of variety for framerate options, but was still a great deal cheaper than other methods at the time. I'm not sure how the remaining soaps that are on air are handling it these days, since the only ones that I can think of are in HD, so I can't really help you out much there.
 
They shot on tape, which doesn't offer a lot of variety for framerate options, but was still a great deal cheaper than other methods at the time. I'm not sure how the remaining soaps that are on air are handling it these days, since the only ones that I can think of are in HD, so I can't really help you out much there.

What about before the advent of digital video? The way you say it, 60fps seems cheaper than 30fps.
 
I get the idea that most movies run in the 20-something fps zone, so having a game with those fps makes it feel more movie-like. But IMO more fps = more lifelike and indicative of how we see real life.


Lol OK? So then you see the point of the argument? Visually speaking, lower frame rate, solid of course, can LOOK way better than 60fps.
 
I've always been for 60fps+ until I saw The Hobbit in HFR/48fps.

I had just thought we were conditioned to 24fps and that the higher the frame rate, the better. Why would you not want a smoother picture? Well, it turns out 24fps, despite being a technological limitation and money-saving solution at the time, was a blessing in disguise and gave cinema its "look". 48fps made The Hobbit look like a stage play, TV show or real life. Watching it at home in 24fps, it looked like a movie again. There's just something about it.

With video games, I figured it should always be 60fps until I played Metal Gear Solid V. There was just something very jarring about 60fps. It felt too smooth and hyper realistic. At 30fps it feels more like a movie.

I think it's going to depend heavily on the game and genre. For FPS games and silly action games, 60fps+ is the way to go since you want as fast paced action as possible. For a cinematic game like The Last of Us, I just don't know how 60fps is going to feel.
 
If you want cinematic, ask for a grain filter to be applied over the image.

Otherwise, this "30FPS is cinematic" crap is nonsense. Replace cinematic with janky, and that's what you're really trying to say. For some reason people are trying to redefine janky, stepped-motion CGI as "cinematic".

No. Just no.
 
What about before the advent of digital video? The way you say it, 60fps seems cheaper than 30fps.

I'm a little rusty on my soap opera production history (odd thing to have a respectable amount of knowledge about, I know), but prior to video and taping being the go-to choice for soaps, they actually broadcasted live.
 
I also disagree with the idea that I like 24 or 30 fps because that is what I am used to. Or yeah that Stockholm syndrome argument.

No, I like the film/cinema frame rate because of its actual qualities. And I don't mind animation or old anime that goes 20 fps either. I will not think it is shit. Cool note about The Hobbit; I'd wondered how long each frame was. I do not like lower frame rates to the exclusion of others.

If the animation or motion frame to frame is inconsistent, it will look bad. For video games, frame time and input lag are issues, and action games do benefit from higher frame rates, but the motion, while smoother, does not necessarily appear cinematic, which is what the thread is about.

I'll repeat: I'd like to see a game locked at 24 with nice film blur, and also, conversely (I have been thinking), to see someone implement smooth 2-3 frame interpolated motion blur for 30 fps gameplay, something like the sketch below. Maybe SIGGRAPH has something...
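A minimal sketch of that blended-blur idea, assuming nothing more than a weighted average of the last two or three rendered frames. The class name, the weights, and the array-as-frame representation are illustrative assumptions, not any engine's API:

```python
import numpy as np
from collections import deque

class FrameBlendBlur:
    """Hypothetical post-effect: blend the last few rendered frames together."""

    def __init__(self, weights=(0.6, 0.3, 0.1)):   # newest frame weighted heaviest (assumed)
        self.weights = np.asarray(weights, dtype=np.float32)
        self.history = deque(maxlen=len(weights))

    def push(self, frame):
        """Submit the newest rendered frame; return the blended frame to display."""
        self.history.appendleft(frame.astype(np.float32))
        w = self.weights[: len(self.history)]
        w = w / w.sum()                             # renormalise while the history fills up
        return sum(wi * f for wi, f in zip(w, self.history))

# Stand-in "frames": constant images whose pixel value is the frame index.
blur = FrameBlendBlur()
for i in range(5):
    out = blur.push(np.full((4, 4), float(i)))
print(out[0, 0])  # 3.5 = 0.6*4 + 0.3*3 + 0.1*2, a blend of the last three frames
```

A real engine would more likely use per-pixel velocity buffers rather than whole-frame blending, since blending like this also smears static UI and ghosts across camera cuts; the sketch only shows the cheapest possible version of the idea.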
 
It can be. Some shows shoot at 30 or 60 digitally and spend time and money on postprocessing to make it look like it was shot at 24 on actual film.

And why would they do that if 60 fps is obviously better?

I'm a little rusty on my soap opera production history (odd thing to have a respectable amount of knowledge about, I know), but prior to video and taping being the go-to choice for soaps, they actually broadcasted live.

That's too extreme of a leap into the past. My point is, 30fps TV shows and 60fps soaps have existed simultaneously for a long time. If 60fps was so much better, TV shows would be 60fps too, at least many of them would. If they're filmed at 30fps because of costs, soaps would be 30fps too since they have lower budgets.
 
And why would they do that if 60 fps is obviously better?



That's too extreme of a leap into the past. My point is, 30fps TV shows and 60fps soaps have existed simultaneously for a long time. If 60fps was so much better, TV shows would be 60fps too, at least many of them would. If they're filmed at 30fps because of costs, soaps would be 30fps too since they have lower budgets.

To be honest with you, I don't know if there's really an answer to that that doesn't ultimately come down to preference. Like any technique in filmed productions, HFR is going to suit the needs of some creators better than others, and it's not going to be the standard unless it's forced into being one. I haven't been arguing for it to be forced, and I don't think I've tried to lean heavily in that direction, since I don't feel like it's the correct choice for everything going forward, but the "allergy" towards a more widespread proliferation of the format is deeply troubling to me, as someone who genuinely sees a lot of potential in it for productions and would like to see more people work with it, growing pains and all.

You do have to admit, at least, that stopping now with soaps, some tech demos, and the Hobbit films as representatives of what that tech is like would be insufficient.
 
To be honest with you, I don't know if there's really an answer to that that doesn't ultimately come down to preference. Like any technique in filmed productions, HFR is going to suit the needs of some creators better than others, and it's not going to be the standard unless it's forced into being one. I haven't been arguing for it to be forced, and I don't think I've tried to lean heavily in that direction, since I don't feel like it's the correct choice for everything going forward, but the "allergy" towards a more widespread proliferation of the format is deeply troubling to me, as someone who genuinely sees a lot of potential in it for productions and would like to see more people work with it, growing pains and all.

You do have to admit, at least, that stopping now with soaps, some tech demos, and the Hobbit films as representatives of what that tech is like would be insufficient.

The argument isn't specifically against you, but more towards those opposed to the idea that 30fps is a legitimate preference instead of brainwashing and luddism.

It will be interesting to see how this advances with time, but I doubt my and many others' preferences will change, just like how I still love CRTs and hate LCDs.
 
Is the Xbox One going to have 60fps NFL games this year? It will be interesting to see how people like that.
 
After using 120Hz monitors for ~18 months, even using 60Hz on the desktop gives me a headache. 60fps is alright in games, but 120 is better obviously. 30fps in games gives me nausea. Not trying to be a dick or anything, it just literally makes me feel unwell.

Yes. I had this experience recently. I played through AC4 on PS4, then went back and played AC3 at 120Hz on PC. Once I finished AC3, I started up my PS4 to play some AC4 again and, oh god, instant headache and eyestrain; you notice every flicker like it's slapping you in the face. A low framerate becomes very frustrating, and then the nausea sets in.
 
An argument that gameplay should be 60+fps and cutscenes 30fps is another matter entirely.

Wild Arms 5 did this, and I admit I liked the look.

For me I don't hate cutscenes at 60 if the camera angles are relatively stationary, or move in slow pans.

Shaky-cam or quick-cam movements at 60 make me ill. >_>
 
I have a theory (since MGSV seems to be the largest catalyst for this issue) that people complain about this because a lot of today's games use shaky-cam in cutscenes, which we've been used to seeing in TV and movies for a while now, but not so long in games, and very rarely at 60fps.
 
Wild Arms 5 did this, and I admit I liked the look.

For me I don't hate cutscenes at 60 if the camera angles are relatively stationary, or move in slow pans.

Shaky-cam or quick-cam movements at 60 make me ill. >_>
That seems pretty odd, I would say the opposite -- jerky motions at low or unsteady framerates might be really sickening, especially in virtual reality.

In addition, pretty much any time I watch a movie in a theater and they do a huge landscape pan, it is really obvious how it goes JERK JERK JERK JERK JERK JERK JERK sideways across the screen. Plus, whenever they do shaky camera for fight scenes, it is difficult for me to follow what is happening at 24 fps, but that could just be because I'm old.
 
That seems pretty odd, I would say the opposite -- jerky motions at low or unsteady framerates might be really sickening, especially in virtual reality.

In addition, pretty much any time I watch a movie in a theater and they do a huge landscape pan, it is really obvious how it goes JERK JERK JERK JERK JERK JERK JERK sideways across the screen. Plus, whenever they do shaky camera for fight scenes, it is difficult for me to follow what is happening at 24 fps, but that could just be because I'm old.

Nah, I have the same problem and I don't think I qualify as old ;) Plus, I'm pretty sure movies do it on purpose, to create a fight scene which seems dynamic and brutal without actually showing anything (case in point: The Hunger Games, I hated the camera in this movie so much...)

ITT: Games are for playing, and 60fps is better for that.
 
Games should be 60fps since it's useful for movement and helps out with reactions.

Film/TV doesn't need to be higher than 24 since it's a passive experience; we are only watching it and don't need extra frames to control anything.
 
Is the Xbox One going to have 60fps NFL games this year? It will be interesting to see how people like that.

Sports games have almost always been 60fps. Only the PS3 version of the first madden was 30 fps vs the Xbox 360 which was 60. The nerd rage of the fans at the time was amazing.
 
Good points. But a couple of things. A lot of games do have motion blur, actually. And 30 may not be the correct cinematic frame rate, but it is closer to it than 60.
And if it's a rock solid 30 fps, there's no stuttering.

And about the input lag, it all depends on the game, whether it requires fast responsiveness or not.
And it's not like 30 fps is unplayable : /

While I understand your points, it's definitely not my case.

Maybe it's just me, but with 60 fps everything seems a lot more 'gameish' to me. A cinematic game at 60 would not grow on me : /

The motion blur seen in some games is only artificially added, usually applied to everything but the character and only during panning, and it doesn't fix the animations or help you keep a target while you control the camera. If proper cinematic motion blur were added to the whole scene 100% of the time, it would be too blurry to play.

You say that at 30fps, there is no stuttering, but I'd say that the very thing you probably perceive as being cinematic about the framerate IS the stutter/judder...

Also, it could be argued that 48fps is more cinematic than 30fps, since 48 is actually a multiple of 24; and since stuttering only really minimizes once you hit around 45fps, I would certainly pick 48fps over 30fps. 120fps is preferable since 120 is a multiple of 24 as well as 60.
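To spell out that "multiples" arithmetic (a toy check of the divisibility point, nothing more): a refresh rate that is an exact multiple of the content frame rate shows every frame for the same number of refreshes, while an uneven ratio forces an alternating cadence like 3:2 pulldown.

```python
# Toy check of the "multiples of 24" point: even ratios mean a uniform cadence,
# uneven ratios mean judder (e.g. 24fps on 60Hz is the classic 3:2 pulldown).

def cadence(content_fps, refresh_hz):
    per_frame = refresh_hz / content_fps
    if per_frame.is_integer():
        return f"{int(per_frame)} refreshes per frame (even, no pulldown judder)"
    return f"{per_frame:.2f} refreshes per frame on average (uneven cadence)"

for fps in (24, 30, 60):
    for hz in (48, 60, 120):
        print(f"{fps:>2} fps content on {hz:>3} Hz: {cadence(fps, hz)}")
```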
 
That seems pretty odd, I would say the opposite -- jerky motions at low or unsteady framerates might be really sickening, especially in virtual reality.

In addition, pretty much any time I watch a movie in a theater and they do a huge landscape pan, it is really obvious how it goes JERK JERK JERK JERK JERK JERK JERK sideways across the screen. Plus, whenever they do shaky camera for fight scenes, it is difficult for me to follow what is happening at 24 fps, but that could just be because I'm old.

The thing for me is that 60 fps is (presumably) closer to my normal eye speed, and my eyes do not move that quickly for fast/quick/shaky movements. My vestibular system keeps it in check for my own body movement, but when it's a disembodied camera zipping around/being shaky as fuck, it's nauseating.

At 30 fps there's less info being shoved down my system by default, so it's easier for my vestibular system to adjust.
 
It's Stockholm syndrome; people have been stuck with 30fps and below so long that they're rationalizing reasons to keep it around.
^ this is it, in a nutshell. The truth right there. Pertains to both games and movies. Nothing else needs to be said.

Can't wait till 60fps is the standard in movies.
 
Sports games have almost always been 60fps. Only the PS3 version of the first madden was 30 fps vs the Xbox 360 which was 60. The nerd rage of the fans at the time was amazing.
I believe the poster is referring to the NFL broadcasting their games at 60 fps on XB1, not Madden.
 
The argument isn't specifically against you, but more towards those opposed to the idea that 30fps is a legitimate preference instead of brainwashing and luddism.

It will be interesting to see how this advances with time, but I doubt my and many others' preferences will change, just like how I still love CRTs and hate LCDs.

16fps was the standard at one point (during the silent film era). And when it went up, it went to 24fps, not 25 or 23, because of the divisibility of 24: you could easily tell how much you were cutting or adding (half a second is 12 frames, a quarter is 6 frames, an eighth is 3 frames, etc.). And also, the higher the frame rate, the more of a headache it was to edit (when tape was involved).
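The divisibility point above, in plain numbers (just arithmetic on the frame rates mentioned, nothing historical beyond what the post already says):

```python
# Common editing subdivisions of a second, expressed as frame counts.
# At 24 fps they all land on whole frames; at 25 or 30 some of them don't.

for fps in (16, 24, 25, 30):
    frames = [fps * fraction for fraction in (1 / 2, 1 / 4, 1 / 8)]
    print(f"{fps} fps -> half/quarter/eighth of a second = {frames}")
# 24 fps -> [12.0, 6.0, 3.0]     (all whole frames)
# 25 fps -> [12.5, 6.25, 3.125]  (fractional frames)
# 30 fps -> [15.0, 7.5, 3.75]    (quarters and eighths split frames)
```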
 
And why would they do that if 60 fps is obviously better?

A lot of it is about expectations. The way people think something is supposed to look can sometimes override the technical advantages or disadvantages of the various options available. Familiarity is a critical factor that has to be considered when making something new, whether as a feat of art or engineering.

That's too extreme of a leap into the past. My point is, 30fps TV shows and 60fps soaps have existed simultaneously for a long time. If 60fps was so much better, TV shows would be 60fps too, at least many of them would. If they're filmed at 30fps because of costs, soaps would be 30fps too since they have lower budgets.

Do you think context might matter here, just a little? 60 FPS doesn't mean the same thing for film as it does for computer graphics that are being rendered in real time and controlled by a player. There are lots of things to take into account, not a simple hierarchy of one thing being "the best" and a bunch of deluded people denying it.
 
You remind me of the Arkham games, where the prerendered cutscenes look way worse than the game itself on a good PC.

I have exactly the thing for you. It's called Asura's Wrath, and it's awesome, get the DLC as it completes the story.

Well, I mean mostly in the sense that there does not need to be an FPS jump. I don't care for cutscenes, but I do hope that, if there is a transition, it is seamless.

For example, something like this shot could have the character being pushed forward by the player and then transition into a cutscene, all while remaining visually ambiguous as to whether it is a cutscene or gameplay.

[embedded GIF: iGhBxyjFoNdJ3.gif]


I'm hoping that the start of this GIF is actually the player pushing the character forward. I am excite, son!
 
The argument that a slower and, by consequence, more stuttery image is more cinematic seems weird. It wouldn't be any different than calling a game with black bars covering up the screen cinematic because it replicates the "widescreen" cinema experience.

Neither of these things matters though, since they both can impede the gameplay. It would be like asking someone if they would prefer a larger field of view or a smaller one, then having them reply smaller because it "feels better". You then offer them a telescope or an eyepatch to walk through life with, because it would "feel better", and watch the confused look on their face when you suggest this. Alternatively, offering to have them wear active shutter glasses all the time and not get annoyed by the flicker would be similar.
 
Yeah, technological threads on the internet often end up in that awful way..

Well, you did claim that you don't care what the developer of a game produces, and that you aren't a self entitled internet warrior. It just came across as implying that people who do care (i.e. people who know what they would prefer, me for example) are self-entitled. Maybe I made a leap because it's early and I haven't had coffee yet. Shrug.

Because games are objectively more responsive at a higher framerate, and also look better in motion (the state most games are in the vast majority of the time).

Oh, I see what the issue is. My bad. I meant: who cares what people prefer? Why do we need to make it the standard view that thinking "30fps is cinematic" is wrong? I remember taking sides with an animator back when SotC was announced for an HD remake (and everyone wanted 60fps), when he described and offered evidence of how 30fps gave a better sense of weight when showing two objects bouncing at the same speed at different framerates.

And movies/shows/commercials/music videos/whatever in higher frame rates are jarring to me. I liken it to when miniatures were popular in movies. No matter how detailed the miniatures were, the camera speed, position, and frame-rate-to-object-speed ratio always made for an "uncinematic" experience. It was always obvious and pulled me out. I feel the same way about action on screen at faster framerates than I'm used to, and I personally don't like it. I don't watch movies to be closer to real life. I watch them for a cinematic, celluloid dream (corny AFI reference).

But again, if MGR were 30 for the sake of "weight" or "cinematic experience" then fuck that. So I get preferences, but why be bothered by someone else's?

Actually, ridiculous action flicks could be cool in higher framerates. But when scenery and character drama are involved, I prefer the slower setting, personally.
 
Do people actually make that argument with a straight face? I have a very hard time believing they are not trolling, but I haven't followed the discussion about TLOU so I could be wrong.
 
I just don't get how, of all the compromises to make, movement and animation would be the things to be chopped. Motion is the state a game is in 99% of the time (with exceptions, of course). Why hack that, and why is it so accepted to choose less visual information? Even screen quality in stills really says nothing about how a game looks in play most of the time.

I guess I'm off topic from the cinematic topic though.
 