In defense of the "filmic" look.

Since judder is a byproduct of 24fps that affects movies as well, having minimal amounts of it seems like a good thing.

That's the whole problem though, and that's the whole point of why 24fps works well for movies but not for games: the judder IS what makes it more cinematic. It takes away the 'reality' feel by having too few frames per second to convince the brain that it's watching a real thing, and that works well for storytelling. You experience something more like watching a series of photos that captured a story rather than looking through a window where real things are happening in real-time. All of this would rarely benefit games in any regard, though. Games need to convince you that their reality is real and responsive. Games are not a passive storytelling experience. 24fps cinematic judder would generally be a bad thing for games. See my post further up for more info.
 
I want to say these 30fps people are completely crazy, but I admit I can't help but cringe in those episodes of Twilight Zone that were shot with a different camera at a different FPS or whatever the case was.

They look awful....


However, I'd also say a higher FPS in film does make it look more realistic. As awful as those Twilight Zone episodes look, they appear more as if you're watching them live, as if you were watching them on stage.

As far as games go, it's hard to say. I can't go back to 30FPS Dark Souls after playing it in 60... However, the cinematic nature of MGS or Shadow of the Colossus looked great, and it's hard to imagine them any other way.

It's almost as if, with a higher frames-per-second count, you can see more flaws... more vibrating polygons, and it appears more artificial.

It's up in the air when it comes to cutscenes, but with gameplay, it's gotta be higher than 30 really, especially in games with a lot of motion.
 
However, I'd also say a higher FPS in film does make it look more realistic. As awful as those Twilight Zone episodes look, they appear more as if you're watching them live, as if you were watching them on stage.

Exactly. The higher the framerate, the more you have the 'live' feel. Generally speaking, this works well for games. It's hard to imagine 24fps cinematic judder benefiting a game experience, not impossible but it'd be a pretty unusual case.
 
Exactly. The higher the framerate, the more you have the 'live' feel. Generally speaking, this works well for games. It's hard to imagine 24fps cinematic judder benefiting a game experience, not impossible but it'd be a pretty unusual case.
Don't forget that your interaction with the game feels faster and more responsive which makes it feel more "real" too.
 
Don't forget that your interaction with the game feels faster and more responsive which makes it feel more "real" too.

Exactly, another way to put it would be: watching a movie, the cinematic experience of watching a fast photo flipbook can enhance storytelling. But imagine trying to *play* it and interact with it: would you still prefer a framerate that deliberately introduces judder for storytelling purposes? Unlikely. It really is all about the difference between the brain experiencing something as 'stills' (because, at least on a subconscious level, we still notice judder at 24fps) vs the brain interpreting something as 'real' (because high framerates provide a good enough illusion of real movement). Both have their pros and cons, but when it comes to gameplay I don't want to feel like I'm seeing a fast series of stills, and I don't know why anyone would want to. Aside from potential, and probably unusual, artistic experiments, high framerates are the obvious way to go for games.
 
I want to say these 30fps people are completely crazy, but I admit I can't help but cringe in those episodes of Twilight Zone that were shot with a different camera at a different FPS or whatever the case was.

That's video vs. film, which brings with it a lot more than just framerate differences.

The 24fps subtle 'stutter' that is noticeable in film has the following repercussions:
- films are carefully shot; in many contexts fast movements, and certainly particular speeds of camera panning, are avoided to make sure the stutter is not really consciously noticeable
- the brain still perceives a certain level of stutter, making it feel less like you're looking through a window and more like you're watching a fast flipbook of photos. Psychologically this creates a different effect that tends to work well for storytelling, and it's the opposite of 'reality shows' or news broadcasts, where the 50 to 60 fps look tends to give you a 'reality' feel rather than a 'storytelling' feel
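For what it's worth, the cadence mismatch behind that stutter is easy to put into numbers. A minimal sketch (assuming the common 3:2 pulldown used to show 24 fps film on a 60 Hz display; purely illustrative):

```python
# Sketch: why 24 fps judders on a 60 Hz display (3:2 pulldown).
# 60 / 24 = 2.5, so a film frame can't be held for a whole number of
# refresh cycles; instead frames alternate between 3 and 2 cycles.
REFRESH_HZ = 60
CADENCE = [3, 2]  # refresh cycles per film frame, repeating

hold_times_ms = [
    round(CADENCE[i % 2] * 1000 / REFRESH_HZ, 1)
    for i in range(6)  # first six film frames
]
print(hold_times_ms)  # [50.0, 33.3, 50.0, 33.3, 50.0, 33.3]
```

The uneven 50 ms / 33 ms hold times are one source of visible stutter on TVs; on a true 24 Hz projector the cadence is even, but 1/24 s per frame is still long enough for fast motion to read as discrete steps.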

This 'tends to work well for storytelling' claim is highly interesting and, you know, pretty much pulled out of thin air.

Your feelings about high frame rates seeming more 'real' and less 'storybook' come from the way they are used... not the other way around. The news is shot on video because you can't shoot live any other way. Reality shows are shot on video because it's highly impractical (and massively expensive) to have cameras running around the clock shooting on film. It's much cheaper shooting to tape or hard drives, and you don't have to break every ten minutes to load film into the cameras.

Your expectations are based on your familiarity with how these mediums have been used. This '24 fps works better for storytelling' claim is nonsense.
 
That's video vs. film, which brings with it a lot more than just framerate differences.



This 'tends to work well for storytelling' claim is highly interesting and, you know, pretty much pulled out of thin air.

Your feelings about high frame rates seeming more 'real' and less 'storybook' come from the way they are used... not the other way around. The news is shot on video because you can't shoot live any other way. Reality shows are shot on video because it's highly impractical (and massively expensive) to have cameras running around the clock shooting on film. It's much cheaper shooting to tape or hard drives, and you don't have to break every ten minutes to load film into the cameras.

Your expectations are based on your familiarity with how these mediums have been used. This '24 fps works better for storytelling' claim is nonsense.

We have loooong passed the era where we had to choose a framerate because of practical limitations when it comes to TV and video. News can be shot at 24fps and drama can be shot at 60fps. The reason we don't want to is not just familiarity; there is a clear difference between seeing something that is fluid enough to perceive as real and something that judders enough to break the illusion.

The exact effects can always be debated but to simply dismiss them as only being down to familiarity is blatantly wrong.

edit: how about we bring an expert in here who is way more knowledgeable on the subject than you and I combined: http://www.macvideo.tv/camera-technology/interviews/?articleId=3213230
It's a tricky debate because there is a level of subjectivity in what you prefer. However, if you just label it as familiarity, you're dramatically simplifying and misrepresenting the issue. This is a big mistake in the 24fps vs 48fps debate, for example; this idea that we just gotta get used to the 'better' framerates is not entirely correct.
 
One and done. People who say they want games to look "cinematic" have no idea what they are talking about. What does that even mean? Cinema is not gaming. It's like saying, "I wish this book were more like a song."

LOL one and done... except that's not 24 fps

There is no "cinematic look" per se. It's just what you are used to seeing in the cinema. The decision to make films in 24p is not because it looks best for the medium, but because of cost and technical limitations. If people get used to HFR, that will be the "new cinematic look".

That is all your shitty "cinematic" look boils down to. What the tech was capable of and what people are used to.

And also, just for the record because some people actually forget that: games are different from movies.

Wow, it's shocking how many people are missing the point of this thread. And lol, that "shitty cinematic look" hasn't been a technical limitation for decades; there is a reason it's still the standard.

Films run at 24 fps because the industry decided that was the minimum rate to fool your eyes. They didn't bother to make a better standard because it would cost more. So this whole 24 fps cinematic motion is just a compromise that we got used to, not something that was considered best for motion pictures or an artistic decision.

Both games and films should be better with higher frame rates. More frames = more visual information and more fluid motion. Simple as that. But games need this even more because you need responsiveness and you control the camera movement.
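The responsiveness point can be made concrete with a back-of-the-envelope sketch (a simplification that assumes input is sampled once per frame and shown at the end of the next frame; real pipelines add display and driver latency on top):

```python
# Sketch: per-frame time budget and a rough worst-case input delay.
# Assumes input sampled at the start of one frame appears at the end
# of the next frame (simplified; extra pipeline latency ignored).
for fps in (24, 30, 60):
    frame_ms = 1000 / fps
    worst_case_ms = 2 * frame_ms
    print(f"{fps:>2} fps: {frame_ms:5.1f} ms per frame, "
          f"~{worst_case_ms:5.1f} ms worst-case input-to-display")
```

Halving the frame time at 60 fps roughly halves this baseline delay, which is part of why higher framerates make controls feel more "real".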

You miss the point as well... but anyway, read above: we don't need to do 24fps anymore, we choose to.
 
We have loooong passed the era where we had to choose a framerate because of practical limitations when it comes to TV and video. News can be shot at 24fps and drama can be shot at 60fps. The reason we don't want to is not just familiarity; there is a clear difference between seeing something that is fluid enough to perceive as real and something that judders enough to break the illusion.

That clear difference is your familiarity with how they have been used in the past, when practical limitations led one type of format to be used for one type of content.

'Judders enough to break the illusion' is meaningless nonsense.

Arguments from authority are lazy. I don't even need to bother telling you the names of much more respected directors that would agree with me because you already know them, and it doesn't change how you feel. It shouldn't. People who know more than both of us agree with me too. So... what good does that lazy debating technique do either of us?
 
'Judders enough to break the illusion' is meaningless nonsense.

It clearly isn't though. I can point out tons of moments in movies where the judder is very apparent. Are you going to tell me that it's not there and that the movement actually IS fluid enough to trick my brain into seeing it as real motion? Cause it just isn't. That's not how it works. You're reaching here to try to pin it purely on familiarity but there is clearly more to it.

Arguments from authority are lazy. I don't even need to bother telling you the names of much more respected directors that would agree with me because you already know them, and it doesn't change how you feel. It shouldn't. People who know more than both of us agree with me too. So... what good does that lazy debating technique do either of us?

I agree up to a point, though slightly to my defense: this is a cinematographer who has done way more technical work in this very specific field than most of the prominent directors who also have experience with it. But still, agree with your point, just thought I'd bring in a different voice to show that I'm not completely insane or something.
 
It clearly isn't though. I can point out tons of moments in movies where the judder is very apparent. Are you going to tell me that it's not there and that the movement actually IS fluid enough to trick my brain into seeing it as real motion? Cause it just isn't. That's not how it works. You're reaching here to try to pin it purely on familiarity but there is clearly more to it.

Neither 24 fps nor 60 fps looks like real motion. I have never seen any evidence that 24 fps is a magic amount of judder that better lends itself to fictional stories while 60 fps is better suited to so-called 'reality' TV.
 
You miss the point as well... but anyway, read above: we don't need to do 24fps anymore, we choose to.

The Hobbit in 48 fps looks incredible to me, much better than 24 fps. 24 fps doesn't really look objectively better. The fact that you and others prefer to watch things that way is most likely a result of you having grown up watching 24 fps movies, and not 48 fps movies. If someone has watched movies a certain way their whole life, obviously it will look weird to them when the "look" of the movies all of a sudden changes to a smoother and clearer image.

And like I said a few pages back, if you let some kids grow up playing only 60 fps games and watching 48 fps movies, then when they get older you try to get them to play a 30 fps game and watch a 24 fps movie, what do you think their reaction would be? I mean, do you really think they'd prefer that? Or would they think it looks wrong, due to being used to something else?

Isn't it also more expensive to film a movie in 48 frames, compared to 24?
 
Neither 24 fps nor 60 fps looks like real motion. I have never seen any evidence that 24 fps is a magic amount of judder that better lends itself to fictional stories while 60 fps is better suited to so-called 'reality' TV.

Sorry, 60fps is definitely enough to feel like real motion almost all of the time. Judder at 24fps is way more apparent and obviously breaks that illusion a LOT more. I'm not saying 24fps is a magical number (22fps and 25fps work too, for example), I'm just saying that the *dramatic* increase of judder you get at that rate clearly has a particular effect that goes way beyond familiarity.

With your arguments, you could also say that 10fps is clearly always bad because it is so low. But it is not. Low framerates such as 10fps also have a particular psychological effect, which is why they are sometimes used in dramatic flashbacks.

Simply stating that higher is always better and any preference is down to familiarity is wrong imo.
 
How many fps does real life run in?
Depends on the lighting condition, but a lot more than 60.

As someone else said, we are used to movies at 24-30 fps, so anything that's more than that feels weird, including games; it's not like games are different. 60fps is like HD resolution: once you get used to it, it's hard to go back, but it's sufficient to watch a few extreme sports videos (thus, fast paced) in 30fps and then in 60fps to notice that there's a world of difference, and the latter are a lot better.
 
Sorry, 60fps is definitely enough to feel like real motion almost all of the time. Judder at 24fps is way more apparent and obviously breaks that illusion a LOT more. I'm not saying 24fps is a magical number (22fps and 25fps work too, for example), I'm just saying that the *dramatic* increase of judder you get at that rate clearly has a particular effect that goes way beyond familiarity.

Why do you believe this though? I mean, what are you basing it on?

Logically I struggle to wrap my head around the argument that, in order to suspend my disbelief, I need MORE artificiality inserted between me and the film or game.

We need to break the illusion of realistic movement to present believable stories? Since when? I fully believed in the stories when the BBC was still shooting predominantly on video. Hell, if you'd have asked me whether things looked better when they were inside or outside I'd have said inside.

In the UK, 24p doesn't represent 'more conducive to story'; it represents a more expensive look. This real/not-real association doesn't seem to exist, and that to me says that our emotional reaction to framerate is indeed based on familiarity.

Our dramas *were* cheaper than the 24 fps content getting shown along side them on TV.
 
The Hobbit in 48 fps looks incredible to me, much better than 24 fps. 24 fps doesn't really look objectively better. The fact that you and others prefer to watch things that way is most likely a result of you having grown up watching 24 fps movies, and not 48 fps movies. If someone has watched movies a certain way their whole life, obviously it will look weird to them when the "look" of the movies all of a sudden changes to a smoother and clearer image.

And like I said a few pages back, if you let some kids grow up playing only 60 fps games and watching 48 fps movies, then when they get older you try to get them to play a 30 fps game and watch a 24 fps movie, what do you think their reaction would be? I mean, do you really think they'd prefer that? Or would they think it looks wrong, due to being used to something else?

Isn't it also more expensive to film a movie in 48 frames, compared to 24?

No it's not, lol, not to the point where it would deter a director if they really cared... hell, companies are shelling out millions for minor things; you don't think they'd shell out a little more if the director WANTED that look? Gimme a break, you can't honestly believe that. Maybe you thought the 48FPS looked awesome; I didn't, and maybe it is familiarity, but all I know is the greatest filmmakers of all time chose 24 and CONTINUE to choose 24, and it's more than familiarity; calling it just familiarity is nonsense.

Sorry, 60fps is definitely enough to feel like real motion almost all of the time. Judder at 24fps is way more apparent and obviously breaks that illusion a LOT more. I'm not saying 24fps is a magical number (22fps and 25fps work too, for example), I'm just saying that the *dramatic* increase of judder you get at that rate clearly has a particular effect that goes way beyond familiarity.

With your arguments, you could also say that 10fps is clearly always bad because it is so low. But it is not. Low framerates such as 10fps also have a particular psychological effect, which is why they are sometimes used in dramatic flashbacks.

Simply stating that higher is always better and any preference is down to familiarity is wrong imo.

Agreed completely.
 
If at this point we are arguing 30fps vs 24fps for traditional media like film, I feel like people who work in that field every day can give us a better explanation why. I mean, editors and videographers who work on TV at 30fps constantly see it, but they choose to shoot at 24fps when working on film. These people see it every day; I would think if 30 felt even better than 24 to them, they would be rallying behind 60 with James Cameron.

There's framerate conformation software built into Adobe Premiere to change 25 to 24 or 30 to 24. If there are tools to conform to 24fps, there has to be some merit to that frame rate and how it creates a specific look.
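As a rough illustration of what a 25-to-24 conform does (the common approach retimes the footage rather than dropping frames; this sketch is illustrative, not how Premiere is implemented internally):

```python
# Sketch: conforming 25 fps footage to 24 fps by retiming.
# Every source frame is kept; the clip just plays slightly slower,
# which also pitches the audio down unless it is corrected.
src_fps, dst_fps = 25, 24
src_duration_s = 60.0                 # one minute of 25 fps footage
n_frames = int(src_fps * src_duration_s)

dst_duration_s = n_frames / dst_fps   # 62.5 s after the conform
slowdown_pct = (src_fps / dst_fps - 1) * 100

print(f"{n_frames} frames -> {dst_duration_s:.1f} s "
      f"({slowdown_pct:.1f}% longer)")
```

The ~4% stretch is small enough to go unnoticed by most viewers, which is why retiming is preferred over interpolating new frames.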

Games are a whole different ball game, as most games actually task you with doing something based on your reaction time. That's not the discussion most of us are having now, but it's moved to how we can discredit 24fps for general media. I don't think framerate is one-size-fits-all, and it shouldn't be.
 
Why do you believe this though? I mean, what are you basing it on?

Logically I struggle to wrap my head around the argument that, in order to suspend my disbelief, I need MORE artificiality inserted between me and the film or game.

We need to break the illusion of realistic movement to present believable stories? Since when? I fully believed in the stories when the BBC was still shooting predominantly on video. Hell, if you'd have asked me whether things looked better when they were inside or outside I'd have said inside.

In the UK, 24p doesn't represent 'more conducive to story'; it represents a more expensive look. This real/not-real association doesn't seem to exist, and that to me says that our emotional reaction to framerate is indeed based on familiarity.

Our dramas *were* cheaper than the 24 fps content getting shown along side them on TV.

Why does it look more expensive? Didn't you just kind of prove the point that 24FPS, for whatever reason, appears to be of higher quality overall?
 
it's sufficient to watch a few extreme sports videos (thus, fast paced) in 30fps and then in 60fps to notice that there's a world of difference, and the latter are a lot better.

That's a very interesting point and imo pokes big holes in the familiarity argument. When it comes to news or reality shows and sports, people *easily* prefer 60fps. How come, then, that high framerates in traditional cinematic experiences often (not always, but often) look wrong to a lot of people (not all, but often a majority)? Is our sense of familiarity with these framerates so selective? Can we somehow not appreciate higher framerates, due to familiarity, only when it's about cinematic works, while immediately appreciating them when applied to sports and news broadcasts? I don't think familiarity can work like that.

Dropping framerates down to very low numbers clearly has a specific psychological effect. How much this remains at play as you increase the framerate can be debated, but it is there. And IF there is a storytelling or stylistic effect associated with low framerates (which I believe there is), it would be one that is in a sense comparable to dramatic cinematic depth of field: it can work in cutscenes, but it would in most cases be detrimental to the gameplay. You don't want it applied everywhere in your game. And certainly, since games don't have filmic motion blur, high framerates are even more optimal most of the time.
 
Wow, it's shocking how many people are missing the point of this thread. And lol, that "shitty cinematic look" hasn't been a technical limitation for decades; there is a reason it's still the standard.

The reasoning behind that is valid, but it's nothing more than an afterthought. It's a theory put into practice AFTER people got used to 24FPS. It wasn't like they used 24FPS because of that theory. It could be true, but most certainly it isn't. Feed people 60FPS movies over two decades and then ask them again how they feel about 24p and how they react. It's something that is basically impossible to research in a way that yields valid data.
 
Can't believe there are still some who subscribe to this notion. 24 feels "right" just like how your voice's pitch sounds "just right". You're not used to any other way.
 
Why does it look more expensive? Didn't you just kind of prove the point that 24FPS, for whatever reason, appears to be of higher quality overall?

Because like I said, the 24 FPS content that was shown against our homegrown 50 FPS content *was* more expensive. The movies and American TV shows had higher budgets, better costumes, better sets, etc.

Doctor Who in the eighties didn't look cheap compared to ST:TNG because it wasn't 24 fps. It was because it was cheap compared to ST:TNG.

UK shows wanted to look more like American shows. So they switched to 24 which people associated with higher production values. Because of familiarity.
 
Because like I said, the 24 FPS content that was shown against our homegrown 50 FPS content *was* more expensive. The movies and American TV shows had higher budgets, better costumes, better sets, etc.

Doctor Who in the eighties didn't look cheap compared to ST:TNG because it wasn't 24 fps. It was because it was cheap compared to ST:TNG.

UK shows wanted to look more like American shows. So they switched to 24 which people associated with higher production values. Because of familiarity.

Because of familiarity? Wouldn't changing the most popular shows like Doctor Who cause a very jarring reaction? Like "what the hell am I watching?!"
 
The reasoning behind that is valid, but it's nothing more than an afterthought. It's a theory put into practice AFTER people got used to 24FPS. It wasn't like they used 24FPS because of that theory. It could be true, but most certainly it isn't. Feed people 60FPS movies over two decades and then ask them again how they feel about 24p and how they react. It's something that is basically impossible to research in a way that yields valid data.

I don't think anyone in the generations in the UK who grew up regularly seeing 25 fps and 50 fps fiction would have trouble suspending their disbelief in 50 or 60 fps fictional content.

I mean, I'm very glad America shot most everything on film, but that's more because you can blow that shit up to HD and it looks amazing.

I wish Doctor Who would go back to 50 fps... but I get why they won't any time soon.

Because of familiarity? Wouldn't changing the most popular shows like Doctor Who cause a very jarring reaction? Like "what the hell am I watching?!"

I'm not sure what you mean here. You don't hear people in the UK talk about 'soap opera effect' or 'fast forward' with The Hobbit or whatever because they've seen way more content at 50 fps. You will hear them talk about 50 fps feeling 'cheaper' because the content they are familiar with shot at 50 fps WAS cheaper than the content they are familiar with shot at 24/25.
 
No it's not, lol, not to the point where it would deter a director if they really cared... hell, companies are shelling out millions for minor things; you don't think they'd shell out a little more if the director WANTED that look? Gimme a break, you can't honestly believe that. Maybe you thought the 48FPS looked awesome; I didn't, and maybe it is familiarity, but all I know is the greatest filmmakers of all time chose 24 and CONTINUE to choose 24, and it's more than familiarity; calling it just familiarity is nonsense.
Wouldn't cinemas need to upgrade their projectors? I know that my cinema wasn't able to screen The Hobbit at 48fps. Blu-rays also seem to be at their limit, since you won't find a 48fps version of it.

I mean, I agree that if all the directors started pushing it, it would become the new standard, but there are also some technical problems keeping it back. 3D also needed Avatar to finally break through.
 
I don't think anyone in the generations in the UK who grew up regularly seeing 25 fps and 50 fps fiction would have trouble suspending their disbelief in 50 or 60 fps fictional content.

I mean, I'm very glad America shot most everything on film, but that's more because you can blow that shit up to HD and it looks amazing.

I wish Doctor Who would go back to 50 fps... but I get why they won't any time soon.



I'm not sure what you mean here.

I mean, shouldn't the general public have been used to that "cheap look", and when seeing the "expensive look", have thought it looked crap due to the drop in frames?
 
I mean, shouldn't the general public have been used to that "cheap look", and when seeing the "expensive look", have thought it looked crap due to the drop in frames?

Well, speaking for myself, I always thought things looked crap when a UK TV show went outside and switched to shooting on film, but that wasn't really down to framerate.

The 'cheap look' vs 'expensive look' association is based on the fact that the cheap content was shot on video and the expensive content was shot on film.

Video = cheap and film = expensive is not inherent. It's not something that a person plopped down in front of the two will come to all by themselves, I don't think. But if you 'train' them with more expensive content shot on film against cheap content shot on video, then yes, I think they will come to associate the video look with 'cheapness'.

Or to put it another way... most people in the UK vastly prefer to watch the first few seasons of Red Dwarf in their original forms at higher framerates than the 're-mastered' versions that were treated to look like film.
 
No it's not lol, not to the point where it would deter a director if they really cared... hell companies are shelling out millions for minor things, you don't think they'd shell out a little more if the director WANTED that look. Gimme a break you can't honestly believe that. Maybe you thought the 48FPS looked awesome, I didn't and maybe it is familiarity, but all I know is the greatest filmmakers of all time chose 24 and CONTINUE to choose 24, and it's more than familiarity, that's nonsense.

24 fps is preferential for economic reasons though. You need bigger render farms to render special effects at 48 fps as quickly as at 24 fps, and you can't charge increased ticket prices for 48 fps movies. Without knowing for sure, I'd imagine you'd also need to invest a lot of money in new equipment to shoot 48 frames per second, just like you had to use new equipment to film 3D. Unlike 3D, however, you don't have increased ticket prices to incentivize that investment.

And just so you know, 24 fps didn't become the movie standard due to it looking the best. It became the standard because it allowed proper sound quality at a reasonable price back when sound films became a thing.

Also, who is it that you consider the greatest filmmakers of all time? Because filmmakers like Peter Jackson and James Cameron quite like 48 fps movies. And these guys are pretty decent at making films.
 
Guessing almost everyone here who thinks 24-30 fps is fine is a console gamer and isn't used to 60 fps.

That is probably all it comes down to. Really. At least 75% of the naysayers would be gone if they would just play at 60FPS on a very regular basis, or solely. My perception wasn't like that a few years ago; I was always sensitive to framerate, sure, but I was fine with 30FPS. That simply evolved because I fed my brain different impressions over a long period of time. I'm sure that would work for almost anybody.
 
This 'tends to work well for storytelling' claim is highly interesting and, you know, pretty much pulled out of thin air.
Not necessarily. It's not much of a stretch to think that something that intentionally steers away from the real look can be used to set the stage for a fictitious presentation, the same way color grading, b/w scenes, etc. are used for a particular mood-setting purpose. Then there's also what I wrote about in some other thread: our vision doesn't actually perceive things perfectly smoothly when we move our heads around, as our eyes have micromovements that fixate for split seconds onto points in space as we look around, creating some visual stutter. I think that in particular is the main reason why high framerates look artificial to so many people, even though common sense tells you they shouldn't.
 
OP is right to an extent. The overall "look" of a moving image is very dependent on the techniques used to capture or create it, since different methods can affect everything from super obvious qualities (black and white versus color) to really subtle ones like the way colors are reproduced when using a certain type of film.

It logically follows that a game running at 30 frames per second and intentionally reproducing certain camera effects will look more like a Hollywood film than one running at 60 and aiming only to reproduce effects experienced by the human eye when viewing something in person.
 
Not necessarily. It's not much of a stretch to think that something that intentionally steers away from the real look can be used to set the stage for a fictitious presentation, the same way color grading, b/w scenes, etc. are used for a particular mood-setting purpose. Then there's also what I wrote about in some other thread: our vision doesn't actually perceive things perfectly smoothly when we move our heads around, as our eyes have micromovements that fixate for split seconds onto points in space as we look around, creating some visual stutter. I think that in particular is the main reason why high framerates look artificial to so many people.

But look at what you just wrote.

'high framerates look artificial to so many people'.

Ergo 24 fps which steers away from the 'real look' can be used to set the stage for fiction?

See, that's what I don't remotely follow.
 
That is probably all it comes down to. Really. At least 75% of the naysayers would be gone if they would just play at 60FPS on a very regular basis, or solely. My perception wasn't like that a few years ago; I was always sensitive to framerate, sure, but I was fine with 30FPS. That simply evolved because I fed my brain different impressions over a long period of time. I'm sure that would work for almost anybody.

Yup, 3 years ago I went from the usual 25-30 fps of Xbox 360 games to a newly built PC playing all games at 60 fps... can't go back to 30 now.
 
But look at what you just wrote.

'high framerates look artificial to so many people'.

Ergo 24 fps which steers away from the 'real look' can be used to set the stage for fiction?

See, that's what I don't remotely follow.
I know, it's more that I wrote two things to try and explain the same phenomenon. I worded it badly, I admit. The latter is something I learned about pretty recently, and by observing what it looks like when I look around, it really does make sense (i.e. when I look around, it really doesn't feel the same way as when the camera moves around in 60FPS games).
 
LOL one and done... except that's not 24 fps

Wow, it's shocking how many people are missing the point of this thread. And lol, that "shitty cinematic look" hasn't been a technical limitation for decades; there is a reason it's still the standard.



You missed the point as well... but anyway, read above: we don't need to do 24fps anymore, we choose to.

I love it when someone evidently has no clue what he is talking about. Film is held back by the 24 fps / 35 mm standard. It became a standard because 1. it is cheaper and 2. it is made with home video in mind. There was never a home format, not even Blu-ray, that was able to present films at their highest quality. Look up what Ebert wrote; look up what Trumbull says. Films are held back. Support visions, not compromises.

And again, we are talking about games. 1886 has the cinematic excuse; Driveclub... I don't know anymore. Excuses don't change the fact that 60 fps was always better for games.
No one will complain that TLoU PS4 won't be cinematic enough.
No one will complain that Halo won't be cinematic enough.
No one will complain that Metal Gear won't be cinematic enough.
You are buying into a bullshit excuse.
 
I know, it's more that I wrote two things to try and explain the same phenomenon. I worded it badly, I admit.

I struggle with the following, which I am going to frame as sensibly as I have seen the argument presented:

"Because 60 FPS looks more real, it makes all the unreality of movie making more obvious, and therefore it makes the film seem less real."

"Because 24 FPS looks less real, it gives the viewer more room to suspend their disbelief and makes the film seem more real."

When I watch The Hobbit in HFR, or watch a play, or a classic British TV show, I can easily suspend my disbelief and fully engage in them.

When I watch reality TV I can still see how painfully staged most of it is, despite it being shot the same way as sporting events and the news.

I do not believe I have special abilities in either of these cases. For many people 24P is a cue to suspend their disbelief. It's a cue they are used to. They react to the content differently as a result... but it's not the only valid cue.

'The sets look like sets' was one of the weirdest criticisms I heard about The Hobbit in HFR. As if you couldn't clearly tell what was a set and what wasn't in 24 FPS if you were looking for it.

We will all easily suspend our disbelief in fiction without such cues as framerate, and we will enjoy the higher detail, and ultimately be more convinced by the fictional worlds presented to us, once we get over the hurdle of adjusting to a new framerate cadence.

No one will complain that TLoU PS4 won't be cinematic enough.
No one will complain that Halo won't be cinematic enough.
No one will complain that Metal Gear won't be cinematic enough.
You are buying into a bullshit excuse.
People will. Actually, they already have, but that's by the by. They are a minority, and their numbers will only shrink as people get used to 60 fps content.
 
Is there any high frame rate porn out there? Serious question.

Well, plenty is probably shot at 30 on camcorders and not filmized, so that's sitcom levels of high framerate at the very least.

Soap operas had high framerates because early ones were broadcast live at the NTSC standard of 60 interlaced fields (~59.94) per second, since that was cheaper than shooting on film. This isn't universal any more, but it still wound up becoming part of the way people expected the genre to look.
 
I can appreciate intentional limitations as a stylistic choice, even if I don't care for the style in question. After all, I still appreciate retro games and pixel art, and hardly think that a preference for antiquated techniques is synonymous with a rejection of progress.

But interactivity is the whole point of video games, and chopping the framerate in half or more when you don't have to just gets in the way of that core element. Intentionally sabotaging interactivity, responsiveness, or clarity for visual style is just bizarre to me. But I do suppose it's somewhat fitting for an era where so much time and money gets poured into creating cinematic experiences instead of games, and the mass market would more accurately be described as consumers than players.
 
I struggle with the following, which I am going to frame as sensibly as I have seen the argument presented:

"Because 60 FPS looks more real, it makes all the unreality of movie making more obvious, and therefore it makes the film seem less real."
Well, that's the thing. I used to think 60FPS is supposed to look more real, but I'm not sure that's necessarily the case actually.

Try it: if you move your head around, you don't really perceive the continuous perfectly even fluidity of a game camera moving at a high framerate. It in fact looks kind of stuttery, as your eyes fixate for milliseconds on different points in space as you look around. I'm almost positive this is why people think 60FPS looks 'sped-up' or whatever.

Intentionally sabotaging interactivity, responsiveness, or clarity for visual style is just bizarre to me.
While controller responsiveness is reduced, it doesn't have to be by much at all. Best-case 30FPS controller responsiveness is only slightly worse than usual-case 60FPS controller responsiveness. It especially doesn't matter much in slower-moving games. For example, Limbo was 30FPS on X360 and was perfectly playable. On the other hand, something that scrolls fast, like Resogun, feels like garbage at 30FPS (if you play it remotely over Vita). So game design factors into this a lot.
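To put rough numbers on that responsiveness gap, here's a small sketch of the arithmetic (my own illustrative model, not taken from any particular engine): treat input-to-display latency as a multiple of the frame time, where a tight 30FPS pipeline samples input and shows the result one frame later, while a typical pipeline takes about two frames.

```python
# Illustrative input-latency arithmetic for 30 vs 60 FPS.
# Assumption (hypothetical model): latency = frame time x number of
# frames in the input-to-display pipeline. Real engines and displays
# add their own delays on top of this.
def latency_ms(fps: float, pipeline_frames: int = 2) -> float:
    """Approximate input-to-display latency in milliseconds."""
    frame_time = 1000.0 / fps
    return frame_time * pipeline_frames

best_30 = latency_ms(30, pipeline_frames=1)   # tight 30FPS pipeline, ~33 ms
usual_60 = latency_ms(60, pipeline_frames=2)  # typical 60FPS pipeline, ~33 ms
usual_30 = latency_ms(30, pipeline_frames=2)  # typical 30FPS pipeline, ~67 ms

print(f"best-case 30 FPS:  {best_30:.0f} ms")
print(f"usual-case 60 FPS: {usual_60:.0f} ms")
print(f"usual-case 30 FPS: {usual_30:.0f} ms")
```

Under these assumptions, a best-case 30FPS game can land in the same ballpark as an ordinary 60FPS one, which is why well-engineered 30FPS games can still feel acceptable in slower genres.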
 
Who cares about frame rate preferences in film? All gamers should care about is STABLE 30 fps or 60 fps in their games. The fact that people are even suggesting 24 fps as acceptable in a game is ridiculous. Keep film influences out of my gaming experiences, Nintendo's 1st party has done just fine without it; they're some of the highest rated games around. This trend of developers trying to make their games more film-like has me worried about the future of gaming as a whole.
 
No it's not, lol, not to the point where it would deter a director if they really cared... hell, companies are shelling out millions for minor things; you don't think they'd shell out a little more if the director WANTED that look? Gimme a break, you can't honestly believe that. Maybe you thought the 48FPS looked awesome; I didn't, and maybe it is familiarity, but all I know is the greatest filmmakers of all time chose 24 and CONTINUE to choose 24, and it's more than familiarity, that's nonsense.

Agreed completely.

It's actually quite a bit more expensive, especially in post, where the number of frames is factored into the bid.

Pretty easy to see where the costs add up if you think about it.
 
Who cares about frame rate preferences in film? All gamers should care about is STABLE 120 fps in their games. The fact that people are even suggesting 24 fps as acceptable in a game is ridiculous. Keep film influences out of my gaming experiences, Nintendo's 1st party has done just fine without it; they're some of the highest rated games around. This trend of developers trying to make their games more film-like has me worried about the future of gaming as a whole.
Fixed it for you.

Seriously though, that's where it's at.
 
Well, that's the thing. I used to think 60FPS is supposed to look more real, but I'm not sure that's necessarily the case actually.

Try it: if you move your head around, you don't really perceive the continuous, perfectly even fluidity of a game camera moving at a high framerate. It in fact looks kind of stuttery, as your eyes fixate for milliseconds on different points in space as you look around. I'm almost positive this is why people think 60FPS looks 'sped-up' or whatever.

This is also correct. Video games often prioritize fluidity over realism, and the resulting picture is something that's not directly analogous to either the human eye or any common type of video recording. People very well might find it odd-looking, though this doesn't preclude it from being *practical* at all.

It's also worth noting that there are ongoing efforts to make video games look more real via VR and eye tracking. The idea is to increase the rendering resolution at the spot where your eyes are focusing to replicate the relatively narrow range where human vision is the sharpest.
 
Guessing almost everyone here who thinks 24-30 fps is fine is a console gamer who isn't used to 60 fps.

Wrong. I definitely do a TON of console gaming, but I've been a PC gamer since the beginning as well. My favorite FPS of all time is Quake III.

That is probably all it comes down to. Really. At least 75% of the naysayers would be gone if they just played 60FPS on a regular basis, or exclusively. My perception wasn't like that a few years ago; I was always sensitive to framerate, sure, but I was fine with 30FPS. That simply evolved because I fed my brain different impressions over a long period of time. I'm sure that would work for anybody, probably.

Didn't happen to me.
 
There is no defense for the 'filmic' look. It's just an excuse to make graphics-heavy games that often don't have the gameplay to match, sacrificing core ideals of interaction and continuing the destructive trend of big-budget excess that has conditioned more gamers to expect too much from games on the superficial end, while killing off the real innovation in nuts 'n' bolts design that mid-tier games thrived on.
 
And another thing to consider is controlling the camera with the mouse vs. controlling it with the analog stick.

Controlling a 30FPS camera with the mouse feels laggy as hell. Controlling the 30FPS camera in the same exact game with an analog stick doesn't feel nearly as bad, since you don't have the continuous 1:1 movement-to-visual-update mapping that you have with the mouse.
 