In defense of the "filmic" look.

People who walk past a 16:9 screen showing a 4:3 news report like nothing's wrong scare me more.

Once I saw a letterboxed 16:9 image being shown on a 16:9 screen.
Well, if it's just some stupid thing running on a TV in public there's really not much to do other than complain to the people who run it, who admittedly may be those scary ignorant types.

Meanwhile, the letterboxed 16:9 can be due to something like running an old DVD that DOESN'T zoom in well for whatever reason; maybe it's just slightly off, or the subtitles get blocked out, as was the case with Perfect Blue and Gunbuster's last episode. So just suck it up and settle for the obscene amount of black space.
 
Believe it or not, filmic IS a real word. If it weren't, I'd have been shooting it down beforehand.

And even if he got some of the technical details wrong, I still think he makes a good point: there do seem to be some games more concerned with being cinematic experiences than with actually being interesting games. That's kind of why The Order set people off, though: it's the most naked attempt at a story-driven focus we've seen from any game that's still ostensibly trying to be a more gamey game, like a shooter. So for those of us tired of that approach, it can be an easy target, whether we full-on attack it or just go "uhh, is this really the best way to go about all of this?"

No, that's not why people attack it, though. Sure, there may be a handful of people who do, but the point of this thread, and why I brought it up, is the whole "LOL lower frame rate = film look, gimme a break" reaction. I honestly believe that 75-90% of these people have no idea that movies play at 24p, or even what that means.

Sure, when a developer says they lowered the frame rate for a more cinematic feel, it can be spin. But the people laughing it off don't understand that the idea of lower framerates looking more cinematic has some basis in fact. If you put a 30FPS cutscene from TLOU next to a 60FPS clip, you will see a difference, and given the industry standard we're used to, the 30FPS one would appear more "film-like" to the average person.
 
Maybe someone already mentioned this, but note that films are filmed with film. Yeah, that sounds stupid, but remember that games are normally rendered once per frame, half the time without motion blur and the other half with really bad fake motion blur. When you watch a film, it feels smooth because everything moves slowly and the motion blur creates the illusion of smooth motion. When you play a game without motion blur, where everything naturally jerks around really fast, 24 FPS is not enough to make things look smooth.

Personally, I'm in the "more FPS is better" camp. I want 144Hz monitors. But if you're advocating the filmic look in games anyway, remember that you are arguing for 'temporal supersampling' (for every output frame, multiple sub-frames are rendered and composited/blended together to create 'true'(er) blur), because games won't look filmic without that.
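For the curious, here's a toy Python sketch of what temporal supersampling means in practice. It's not a real renderer: `render_scene` is a hypothetical stand-in, and the "image" is a single pixel value.

```python
# Toy temporal supersampling: render several sub-frames inside one
# output frame's shutter window and average them, approximating the
# motion blur a film camera captures while its shutter is open.

def render_scene(t):
    # Hypothetical stand-in for a renderer: returns a one-pixel "image"
    # for scene time t (seconds), e.g. brightness of a moving object.
    return [t * 100.0]

def temporal_supersample(frame_index, fps=24, subframes=8, shutter=0.5):
    """Blend `subframes` renders spread across the exposure window.

    shutter=0.5 mimics a 180-degree film shutter: the exposure covers
    half of each frame's duration.
    """
    frame_time = 1.0 / fps
    t0 = frame_index * frame_time
    exposure = shutter * frame_time
    samples = [render_scene(t0 + exposure * i / (subframes - 1))
               for i in range(subframes)]
    # Composite/blend: per-pixel average of all sub-frames.
    return [sum(px) / subframes for px in zip(*samples)]

print(temporal_supersample(0))  # frame 0, blurred across its exposure
```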
 
Stable frame rate is more important to me than a high frame rate. GOW 3 ran at 30fps (which is what The Order will run at, according to Ready at Dawn), and it was a smooth experience. I don't expect anything different from the team that made two of the best PSP games ever.
 
We're at a point now where everything (movies and games) should be 60Hz minimum. That said, I personally don't mind 30fps, but 24 can eat a dick. Pulldown is the dumbest thing ever.

Once more content is created at high frame rates, 24p films will feel archaic, and less filmic.

What I'm really waiting for is for YouTube to start supporting HFR.
 
I'm sorry, but this opinion is slightly uninformed. I haven't read the whole thing because it's long, but I got up to this point and I have to say it is just plain wrong:

There is no 48fps in cinema except for Peter Jackson's uglyfest. It is ONLY necessary to flicker at 48Hz or even 72Hz because of the way film (as in celluloid) works. It is something done to overcome the limitations of that technology, but the end result is still exactly the same: 24 frames per second.

Err... isn't that what he's saying, though? If each frame is blanked two or three times, you are effectively showing the same frame two or three times, leaving you with a flicker rate of 48Hz or 72Hz. Despite some imprecise language, I think it's pretty clear he wasn't suggesting there were 48 unique frames per second.

As for how this is relevant: isn't it similar to the way a 30fps game operates on a 60Hz display, and also why you generally don't want to play your 30fps games on a 30Hz display?
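To make the parallel concrete, here's a quick Python sketch of how a fixed-rate source maps onto a 60Hz display (just arithmetic, no video involved):

```python
# How many display refreshes each content frame occupies on a 60Hz
# panel. 24fps forces an alternating 2-3 repeat (2:3 pulldown), while
# 30fps divides evenly into 60 and every frame is shown exactly twice.

def repeat_pattern(content_fps, display_hz=60, frames=8):
    starts = [(i * display_hz) // content_fps for i in range(frames + 1)]
    return [starts[i + 1] - starts[i] for i in range(frames)]

print(repeat_pattern(24))  # [2, 3, 2, 3, 2, 3, 2, 3]  -> uneven cadence
print(repeat_pattern(30))  # [2, 2, 2, 2, 2, 2, 2, 2]  -> even cadence
```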
 
I'd just like to add one more note...

When I was a kid making movies in the 90s, all we had were interlaced video cameras, shooting 60 or 50 interlaced fields per second. We were chasing the dream, trying to make Star Wars or Jurassic Park, the stuff we saw on the big screen, but our movies looked like cheap soap operas on TV.

The first thing we did was crop our films to Cinemascope 2.35:1, or even 16:9, rather than 4:3.

The next thing was to throw away half of our interlaced fields to get to 25fps. The picture quality took a dive, but damn, it looked closer to a real movie.

It's just interesting that we're talking about a very similar thing here in 2014, chasing the dream of a truly 'filmic' experience in video games. That's the argument for 24fps. It's not about better or worse; it's a dream of being a movie.

It would be stupid for all video games to go for a filmic look, because video games are video games, but I wouldn't mind a game every once in a while that was like playing a movie. It's just an aesthetic choice in the end.

But saying '24fps and filmic games are crap' is really nothing more than saying 'cel shading is crap' or 'Call of Duty's story is crap'. It's just plain snobby.


edit:
Err... isn't that what he's saying, though? If each frame is blanked two or three times, you are effectively showing the same frame two or three times, leaving you with a flicker rate of 48Hz or 72Hz. Despite some imprecise language, I think it's pretty clear he wasn't suggesting there were 48 unique frames per second.

As for how this is relevant: isn't it similar to the way a 30fps game operates on a 60Hz display, and also why you generally don't want to play your 30fps games on a 30Hz display?

I think it was what he/she said - "just because film is 24fps does not mean it is good enough for smooth playback on its own." - that made zero sense. It is good enough for smooth playback on its own. Imprecise language is exactly what leads to the confusion we're seeing throughout this thread.

And again, it is definitely irrelevant. Why wouldn't you play your 30fps games on a 30Hz display?
 
Stable frame rate is more important to me than a high frame rate. GOW 3 ran at 30fps (which is what The Order will run at, according to Ready at Dawn), and it was a smooth experience. I don't expect anything different from the team that made two of the best PSP games ever.
We're through the looking glass, people.
 
No, that's not why people attack it, though. Sure, there may be a handful of people who do, but the point of this thread, and why I brought it up, is the whole "LOL lower frame rate = film look, gimme a break" reaction. I honestly believe that 75-90% of these people have no idea that movies play at 24p, or even what that means.

Sure, when a developer says they lowered the frame rate for a more cinematic feel, it can be spin. But the people laughing it off don't understand that the idea of lower framerates looking more cinematic has some basis in fact. If you put a 30FPS cutscene from TLOU next to a 60FPS clip, you will see a difference, and given the industry standard we're used to, the 30FPS one would appear more "film-like" to the average person.
Actually, I'm pretty sure most people know. It's less "that's dumb if you think you can look more like a film by lowering the fps" and more "why the hell do you want to look like a movie so badly anyway?" There may be a matter of taste in whether you prefer 30 or 60, but the REAL reason to do it (and probably the real reason they ARE doing it, or at least the primary benefit they're not afraid to take advantage of) is that you can squeeze a lot more fidelity into the picture. Same with the letterboxing: by going with a smaller effective resolution they can increase the graphical fidelity.
It's just interesting that we're talking about a very similar thing here in 2014, chasing the dream of a truly 'filmic' experience in video games. That's the argument for 24fps. It's not about better or worse; it's a dream of being a movie.

It would be stupid for all video games to go for a filmic look, because video games are video games, but I wouldn't mind a game every once in a while that was like playing a movie. It's just an aesthetic choice in the end.

But saying '24fps and filmic games are crap' is really nothing more than saying 'cel shading is crap' or 'Call of Duty's story is crap'. It's just plain snobby.


edit:

I think it was what he/she said - "just because film is 24fps does not mean it is good enough for smooth playback on its own." - that made zero sense. It is good enough for smooth playback on its own. Imprecise language is exactly what leads to the confusion we're seeing throughout this thread.

And again, it is definitely irrelevant. Why wouldn't you play your 30fps games on a 30Hz display?
I'm going to go ahead and be a snob, then, and say: yes, 24fps in games is complete bullshit, and if you actually aim for that in development on modern console hardware you really, really suck.

Fortunately, no game is actually doing that. The Order may be going for a filmic look, but they already gave 24fps a spin and wrote it off, going for 30 instead. There's really not much reason to do it, and no good one if your game depends on skill, especially when fine aim is demanded: 120Hz displays aren't the standard, so you'd need 2:3 pulldown, which introduces judder, and I have to wonder if it adds input lag too. At any rate, even if it wouldn't look horrible, there's really no good reason, given how few frames you'd shave off going from 30 to 24 and how much smoother 30 is for it.
 
People who walk past a 16:9 screen showing a 4:3 news report like nothing's wrong scare me more.

Once I saw a letterboxed 16:9 image being shown on a 16:9 screen.

This is not that uncommon. For instance, without the HD service, Comcast sends a 4:3 picture for everything, and 16:9 content is letterboxed inside it.
 
Actually, I'm pretty sure most people know. It's less "that's dumb if you think you can look more like a film by lowering the fps" and more "why the hell do you want to look like a movie so badly anyway?" There may be a matter of taste in whether you prefer 30 or 60, but the REAL reason to do it (and probably the real reason they ARE doing it, or at least the primary benefit they're not afraid to take advantage of) is that you can squeeze a lot more fidelity into the picture. Same with the letterboxing: by going with a smaller effective resolution they can increase the graphical fidelity.

I'm going to go ahead and be a snob, then, and say: yes, 24fps in games is complete bullshit, and if you actually aim for that in development on modern console hardware you really, really suck.

Fortunately, no game is actually doing that. The Order may be going for a filmic look, but they already gave 24fps a spin and wrote it off, going for 30 instead. There's really not much reason to do it, and no good one if your game depends on skill, especially when fine aim is demanded: 120Hz displays aren't the standard, so you'd need 2:3 pulldown, which introduces judder, and I have to wonder if it adds input lag too. At any rate, even if it wouldn't look horrible, there's really no good reason, given how few frames you'd shave off going from 30 to 24 and how much smoother 30 is for it.

I definitely agree that 30FPS can be used to great effect; 60FPS games lag behind in terms of overall visual fidelity. It obviously depends on what type of game you're playing: for something like a David Cage game, or anything more story-driven, I prefer tight scenes with amazing detail.

But we can agree to disagree on people's thought process with it. I truly believe a good chunk of those people have no idea why lower FPS = cinematic.
 
I'm going to go ahead and be a snob, then, and say: yes, 24fps in games is complete bullshit, and if you actually aim for that in development on modern console hardware you really, really suck.

Sure, it's a fair opinion. A bit abrasive and unconstructive but fair.

I'm more interested in the fact that aiming for 24fps allows a bit more rendering time per frame on the same hardware. Theoretically it means a dev can produce a game with photorealistic raytracing, really good motion blur, and possibly 4K on the same locked hardware we see in consoles these days. I think it's a bit pointless to aim for it on PC, but it does make sense if you're heading towards the end of a console generation and really want wow factor.

It also depends on the game, though. I'd hate a 24fps action game, but if it's a game largely filled with cinematics or QTE sequences anyway, why not? I also wouldn't mind if it swapped between 60fps for gameplay and 24fps for cinematics, like the FMVs of the PlayStation era.
 
I'm more interested in the fact that aiming for 24fps allows a bit more rendering time per frame on the same hardware.
It doesn't sync up with a 60 Hz display. It would introduce constant judder.

1080p24 support varies in quality from display to display and can't be counted on. Some displays DO handle 24Hz content really well, but it just wouldn't work for the majority of consumers and would create a bad experience for people.

It's simply a poor choice due to the way displays work.
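Putting rough numbers on the judder claim, a small sketch of how long each frame actually stays on a 60Hz screen (same cadence arithmetic as the pulldown example earlier):

```python
# On a 60Hz panel a 24fps frame can't be held for its "true" 41.7ms;
# with 2:3 pulldown it alternates between 2 and 3 refreshes, so even
# motion gets uneven on-screen timing. 30fps stays perfectly regular.

REFRESH_MS = 1000 / 60  # one 60Hz refresh is ~16.7ms

def onscreen_ms(content_fps, frames=6, display_hz=60):
    starts = [(i * display_hz) // content_fps for i in range(frames + 1)]
    return [round((starts[i + 1] - starts[i]) * REFRESH_MS, 1)
            for i in range(frames)]

print(onscreen_ms(24))  # [33.3, 50.0, 33.3, 50.0, 33.3, 50.0] -> judder
print(onscreen_ms(30))  # [33.3, 33.3, 33.3, 33.3, 33.3, 33.3] -> steady
```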
 
I definitely agree that 30FPS can be used to great effect; 60FPS games lag behind in terms of overall visual fidelity. It obviously depends on what type of game you're playing: for something like a David Cage game, or anything more story-driven, I prefer tight scenes with amazing detail.

But we can agree to disagree on people's thought process with it. I truly believe a good chunk of those people have no idea why lower FPS = cinematic.
Admittedly, I was thinking "75-90% here" for some reason, and I do think that at the very worst it'd be 50%, as we're more informed on these issues than the average person.

The average person... fuck, you'd probably have to give a lecture on what frames per second even means, and they'd just go "uhhh, whatever" and completely fail to recognize a difference right in front of their face.
Sure, it's a fair opinion. A bit abrasive and unconstructive but fair.

I'm more interested in the fact that aiming for 24fps allows a bit more rendering time per frame on the same hardware. Theoretically it means a dev can produce a game with photorealistic raytracing, really good motion blur, and possibly 4K on the same locked hardware we see in consoles these days. I think it's a bit pointless to aim for it on PC, but it does make sense if you're heading towards the end of a console generation and really want wow factor.

It also depends on the game, though. I'd hate a 24fps action game, but if it's a game largely filled with cinematics or QTE sequences anyway, why not? I also wouldn't mind if it swapped between 60fps for gameplay and 24fps for cinematics, like the FMVs of the PlayStation era.
Admittedly it's true for 4K, purely because of HDMI limitations on consoles...

But besides that, the basic fact is that you're giving up relatively few frames for likely diminishing returns (I can't imagine accomplishing much more than you already could at 30fps) AND landing on a number that doesn't divide evenly into 60, which the above post covered well. I have to imagine that even for an attempt at cinematic gaming, whatever you could gain from 24 isn't worth the hit in playability, doubly so if you can't manage a good enough motion blur solution.
 
I understand it in cutscenes and turn-based games, maybe... but lower framerates are a bother when actually playing fast action games, because your controller inputs could be represented faster and better with higher framerates.
 
This discussion is the equivalent of arguing against evolution.

My thoughts exactly. Some people here sound like crazy creationists, ignoring all arguments and science against their point of view and restating their babbling over and over again.
 
My thoughts exactly. Some people here sound like crazy creationists, ignoring all arguments and science against their point of view and restating their babbling over and over again.

These comparisons are silly.

We're talking about art, not how many pixels we can cram onto a screen.

I'm in the camp that loves more and more frames for games (especially anything twitch based) as long as it doesn't compromise other aspects of the work.

I barely watch TV, but for most good shows these days I think I'd love more frames too.

Movies are different though. Some films can/could benefit from more frames, but others would be garbage.

Tarantino talks about it more on the artistic side (digital vs film):
https://www.youtube.com/watch?v=BON9Ksn1PqI

I liken these arguments to saying painters should only work digitally now because it's more efficient...
 
I must be the only person who wants every game to look like a soap opera. Fuck all this "filmic experience" stuff in my video games.
 
But besides that, the basic fact is that you're giving up relatively few frames for likely diminishing returns (I can't imagine accomplishing much more than you already could at 30fps) AND landing on a number that doesn't divide evenly into 60, which the above post covered well. I have to imagine that even for an attempt at cinematic gaming, whatever you could gain from 24 isn't worth the hit in playability, doubly so if you can't manage a good enough motion blur solution.

I'm not disagreeing with you here, just thought-experimenting.

I'm ignorant of the programming required, but 24fps vs. 30fps is 25% extra time per frame to do 'stuff'.

Maybe once upon a time, when hardware could render only 10,000 polys per frame, an extra 2,500 polys wouldn't seem like much. But say your hardware can render 1,000,000 polys per frame: at 24fps you can render 250,000 more.

That 25% doesn't seem so insignificant, especially in an era where time-to-do-stuff is getting shorter and shorter.

I'm sure it's not so clear cut, though, and in practice it'll be more like a 15% increase in power.
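The arithmetic above does check out, for what it's worth (a quick check using the poster's own numbers):

```python
# Frame budget at 30fps vs 24fps, and what that buys at a fixed
# polygon throughput (using the 1,000,000-polys example above).

budget_30 = 1000 / 30  # ~33.3 ms per frame
budget_24 = 1000 / 24  # ~41.7 ms per frame

print(f"{budget_24 / budget_30 - 1:.0%} more time per frame")  # 25%

polys_per_ms = 1_000_000 / budget_30       # throughput at 30fps
print(f"{polys_per_ms * budget_24:,.0f}")  # 1,250,000 polys per 24fps frame
```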
 
These comparisons are silly.

We're talking about art, not how many pixels we can cram onto a screen.

....

Yes, we are talking about art.
But we have established that the world "runs" in Planck units. At least three people have mentioned the experiments with air force pilots showing they can "see" between 200 and 300 fps.
We have posters who brought up motion blur and the information stored in movie frames that is not stored in computer game frames.
We know that 24fps was the bare minimum for some kind of fluid movement, acceptable only with heavy motion blur and the way cameras work, and established purely for technical and financial reasons.
All of this, and we still have the same old, tired nonsense stated by some people. Sorry, but I think the comparison is not only true but fair.

I definitely agree that 30FPS can be used to great effect; 60FPS games lag behind in terms of overall visual fidelity. It obviously depends on what type of game you're playing: for something like a David Cage game, or anything more story-driven, I prefer tight scenes with amazing detail.

But we can agree to disagree on people's thought process with it. I truly believe a good chunk of those people have no idea why lower FPS = cinematic.
 
I understand it in cutscenes and turn-based games, maybe... but lower framerates are a bother when actually playing fast action games, because your controller inputs could be represented faster and better with higher framerates.

I posted this in the OP... the argument actually has nothing to do with how it affects the actual gameplay.

Those same "Joe Six Pack" people appreciate how fast and fluid the gameplay is in their Call of Duty.

They seem to enjoy that cinematic feel of Halo as well... weird.
 
I for one would love to have 60 FPS movies. I can't fathom why people prefer the choppy 24FPS mess. The soap opera argument does not fly with me.

That said, I've never dropped a game for being 30 FPS. Racing games need 60 FPS for a proper sense of speed, while fighters need 60 FPS to keep input lag to a minimum, but they're still playable at 30.

But even 60 FPS need not be smooth. Right now I'm playing a title that is very jerky, yet the FPS is 60.

In any case, I prefer 60 to 30.
 
No, that's not why people attack it, though. Sure, there may be a handful of people who do, but the point of this thread, and why I brought it up, is the whole "LOL lower frame rate = film look, gimme a break" reaction. I honestly believe that 75-90% of these people have no idea that movies play at 24p, or even what that means.

Sure, when a developer says they lowered the frame rate for a more cinematic feel, it can be spin. But the people laughing it off don't understand that the idea of lower framerates looking more cinematic has some basis in fact. If you put a 30FPS cutscene from TLOU next to a 60FPS clip, you will see a difference, and given the industry standard we're used to, the 30FPS one would appear more "film-like" to the average person.

I posted this in the OP... the argument actually has nothing to do with how it affects the actual gameplay.



They seem to enjoy that cinematic feel of Halo as well... weird.


Halo will be 60 fps. The Last of Us will be 60 fps.
 
I posted this in the OP... the argument actually has nothing to do with how it affects the actual gameplay.

People think 24fps feels cinematic because that's what they're used to. If you let a group of kids grow up watching only 48fps movies, then show them a "cinematic" 24fps movie when they hit 15, they're gonna think it looks weird as hell.
 
But movie-like motion blur isn't possible in games. You can't demand a movie-like experience with today's capabilities.

For example, here is a Portal 2 video running at 24fps with proper movie motion blur:
https://www.youtube.com/watch?v=cKRE6yG74a4

This is what the OP wants, and it's not that bad in terms of visuals (not counting gameplay). The problem is, this ~2 minute video took about 2 hours to render. It's just not possible to render all that per-object motion blur in real time, so we can never get a true movie-like experience in games. Demanding 24fps in games for that "movie" feel is therefore just wrong.

Exactly. Movies go through hours and hours of post-production and rendering, whereas with games everything is real time. It's why that 120Hz motion-interpolation feature on many TVs looks like crap.
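Some back-of-the-envelope Python on that Portal 2 clip, assuming the quoted figures (roughly 2 minutes of 24fps video, roughly 2 hours of rendering):

```python
# How far offline-quality motion blur is from a real-time budget.
clip_frames = 2 * 60 * 24        # ~2880 frames in a 2-minute 24fps clip
render_seconds = 2 * 60 * 60     # ~7200 seconds of render time

per_frame = render_seconds / clip_frames  # ~2.5 s per rendered frame
budget = 1 / 24                           # ~41.7 ms per frame in real time

print(f"{per_frame:.1f} s/frame vs {budget * 1000:.1f} ms budget")
print(f"roughly {per_frame / budget:.0f}x too slow for real time")  # ~60x
```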
 
Well, the filmic look is not going to work for Project Morpheus and Oculus. They've already stated they require 60fps and low latency.
 
YouTube videos are at 30FPS, including 'Let's Play' videos.

I haven't seen a lot of people complain that watching YouTube videos of games being played at 30FPS is actually bad.
 
I like 60FPS video. MGS Ground Zeroes does all its cutscenes at 60FPS, just like its gameplay, and it looks incredible. I think it's the best-looking current-gen game, because dat smoothness gives it something Infamous and Killzone can't hope to compete with.

I've never seen 48FPS at the cinema, though. I'd really like to, but I don't want to sit through three hours of Hobbit just to get a taste, y'know? I know it's going to be different from the regular experience, and I don't want someone to go back and interpolate another 36FPS into Citizen Kane or anything, but I'd love for it to become an option in the future. I want to see The Raid 3 in high framerate.
 
Maybe someone already mentioned this, but note that films are filmed with film. Yeah, that sounds stupid, but remember that games are normally rendered once per frame, half the time without motion blur and the other half with really bad fake motion blur. When you watch a film, it feels smooth because everything moves slowly and the motion blur creates the illusion of smooth motion. When you play a game without motion blur, where everything naturally jerks around really fast, 24 FPS is not enough to make things look smooth.

...and when things don't move slowly, like in a fast scene with dynamic camera movement, everything looks like a confused mess (hence the heavy use of slow motion in action movies). I really hope we can move forward from 24 fps in movies, at least in certain genres.
 
No, that's not why people attack it, though. Sure, there may be a handful of people who do, but the point of this thread, and why I brought it up, is the whole "LOL lower frame rate = film look, gimme a break" reaction. I honestly believe that 75-90% of these people have no idea that movies play at 24p, or even what that means.

Sure, when a developer says they lowered the frame rate for a more cinematic feel, it can be spin. But the people laughing it off don't understand that the idea of lower framerates looking more cinematic has some basis in fact. If you put a 30FPS cutscene from TLOU next to a 60FPS clip, you will see a difference, and given the industry standard we're used to, the 30FPS one would appear more "film-like" to the average person.


Games aren't movies. In a movie the camera is often still, or slowly panning, to compensate for the low frame rate. In a video game I want to be able to actually move the camera around and still have some sense of what's happening. 30fps games where you're controlling the camera all end up looking like The Blair Witch Project.
 
This is certainly a clever first post, but most monitors run at 60Hz or 120Hz, meaning the video as it appears will look even worse than 24fps actually does.

You either didn't know this and posted in hubris, or you did, and maliciously posted it anyway.
Is there any other way to see gameplay at 24fps other than monitors (or TVs)?
 
I would prefer everything to be 60fps. EVERYTHING. I saw a movie at 48p (The Hobbit) and it looked gorgeous, a lot better than 24p. Yes, 24p is the industry standard and it's what our eyes are adapted to, but I think it is time to go 48p/60fps on everything.
 
I get that 60fps plays better, and overall I definitely prefer it. When it comes to movies, though, doesn't 60FPS make them look strange? Change the way they feel? Why can't the same be said for games?

Not only does the feel of the fps change things, but so do the layers of detail and effects they can add to the game, leading to a more cinematic package.

I know I'm gonna get dumped on here, but I'm mostly trying to play devil's advocate and ask: if it can work in movies, why can't it in games?

edit - I've had to repeat this a ton of times in the thread, so I'm gonna try to clarify here.

My argument is not necessarily about how the input feels, or how the game works as a whole; I'm talking about the cinematic look achieved with a lower frame rate. I know there is more to this than just lowering the frame rate, but there is a reason 24p has stuck as the industry standard. My point is that saying 30FPS can have a more cinematic feel is not as laughable as some treat it, at least IMO. This is not at all to say that 30FPS games play as well as 60FPS games; it's simply stating that they look more like a movie.

There is no "cinematic look" per se. It's just what you are used to see in the cinema. The decision to make films in 24p is not because it looks best for the medium, but because of cost and technical limitations. If people get used to HFR, that will be the "new cinematic look".

That is all your shitty "cinematic" look boils down to. What the tech was capable of and what people are used to.

And also, just for the record because some people actually forget that: Games is different from movies.
 
YouTube videos are at 30FPS, including 'Let's Play' videos.

I haven't seen a lot of people complain that watching YouTube videos of games being played at 30FPS is actually bad.
You obviously didn't see the thread about Gamersyde's new video hosting site.
 
So one thing I just don't understand about 24fps: why is it that whenever I watch a movie it looks very smooth, but whenever a game runs at ~24fps it looks really choppy and feels like it's at least 10fps lower than the movie?
 
I think I actually got upset just from reading the OP.

That one-two punch of implied praise for games aspiring to be movies instead of games, plus the insistence on preserving archaic technical constraints as an artistic ideal.

Ugh.

YouTube videos are at 30FPS, including 'Let's Play' videos.

I haven't seen a lot of people complain that watching YouTube videos of games being played at 30FPS is actually bad.
I get pretty irritated when specific visual effects are butchered completely by having every other frame dropped. (See: the blinking invincibility effect in every retro game ever.)
 
So one thing I just don't understand about 24fps: why is it that whenever I watch a movie it looks very smooth, but whenever a game runs at ~24fps it looks really choppy and feels like it's at least 10fps lower than the movie?

In a nutshell, it's motion blur. Camera lenses introduce a ton of motion blur, and CG movies emulate the same effect in post-processing. Games can do that too, but the effect isn't nearly as good (because games are rendered in real time).
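For a sense of scale, here's a rough sketch of the in-camera blur a film frame picks up, assuming the common 180-degree shutter (the object speed is just an illustrative number):

```python
# With a 180-degree shutter at 24fps, each frame gathers light for half
# the frame interval (1/48 s), so anything moving smears across the
# frame. A game sampling one instant per frame gets zero blur for free.

fps = 24
shutter_angle = 180                       # degrees; 360 = always open
exposure = (shutter_angle / 360) / fps    # 1/48 s ~= 20.8 ms

screen_widths_per_s = 0.5                 # object crosses screen in 2 s
blur = screen_widths_per_s * exposure     # smear width per frame

print(f"exposure: {exposure * 1000:.1f} ms")      # 20.8 ms
print(f"blur: {blur:.1%} of screen width/frame")  # ~1.0%
```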
 
I would prefer everything to be 60fps. EVERYTHING. I saw a movie at 48p (The Hobbit) and it looked gorgeous, a lot better than 24p. Yes, 24p is the industry standard and it's what our eyes are adapted to, but I think it is time to go 48p/60fps on everything.

The Hobbit looked like shit in 48fps.
 