In defense of the "filmic" look.

It has only become passable because of motion blur. You can't have the filmic look without motion blur.

Movies can have perfect motion blur. That means not a single frame is a still frame; every frame is a blend of many sub-frames. So you could (almost) say there is far more than 24 frames' worth of information per second in there. A frame contains movement.
That's not possible in a realtime engine unless you render everything with a hefty (and horrible) delay.
That's why games NEED higher fps than movies. Case closed.
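To make the "a frame contains movement" point concrete, here's a minimal Python sketch of accumulation-style motion blur (my own toy illustration, not from any particular engine; the `render` function and the sweeping-bar scene are invented for the example). One 24fps output frame is built by averaging sub-frames sampled across the exposure window, so the motion during that window ends up baked into the frame:

```python
import numpy as np

def blurred_frame(render, t_start, exposure=1/48, subsamples=8):
    """Average several sub-frames across the exposure window.

    `render(t)` is any function returning an HxWx3 float image for time t.
    A 180-degree shutter at 24fps exposes each frame for 1/48 s.
    """
    times = t_start + np.linspace(0.0, exposure, subsamples, endpoint=False)
    return np.mean([render(t) for t in times], axis=0)

# Toy scene: a bright vertical bar sweeping across a 64x64 image at 240 px/s.
def render(t, width=64, height=64):
    img = np.zeros((height, width, 3))
    img[:, int(t * 240) % width] = 1.0
    return img

frame = blurred_frame(render, t_start=0.0)  # the bar smears across ~5 pixels
print(frame.shape, frame.max())
```

A realtime engine can't do this honestly without either rendering many times faster than the display rate or delaying output, which is the point being made above.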

Leave this thread, watch this GameSpot video, then come back.

In short, one of the biggest differences is motion blur: in movies it sort of smooths things out for our brains, while video games going for 24 fps without faking that motion blur look like, well, watch that Watch_Dogs webm. And there's Ocarina of Time on the N64 at, I THINK, 20 fps.

Now to be fair, that appears to be a framerate generated without the normal optical blurring that characterizes what people mean when they're talking about a filmic look.

The way that films are shot, edited, and touched up is completely different from the way games are rendered. Even motion blur was traditionally achieved in movies through the exposure of film: the blur appeared as a natural product of object or scene motion captured into one frame. Blur in games fundamentally doesn't work that way. It's simulated, just like everything else, and in the current generation the blur doesn't look anywhere near as natural as it does in film.

If you want a game to "look cinematic", I expect some serious effort on the technical side of actually making a game look cinematic. I expect pristine, clean image quality, blur and depth of field that look indistinguishable from their film counterparts, and a complete lack of loading screens so the story misses no beats.

Oh, but that's not what we're getting with these "cinematic" games, is it? We still get shitty blur, jaggies out the ass, poorly segmented sequences or excessive loading times that, if they ever appeared in any movie, would make people walk out of the theater.

"My game is cinematic because it runs at 30 FPS" is the biggest load of horseshit I've ever heard to justify the framerate. That's not how a film works. I can't take a fucking iPhone camera, chop everything to look 24 FPS, then claim a video of my dog skateboarding "makes it look cinematic". Any developer making such a boneheaded argument is blowing smoke up your ass.

EDIT: I want to make it clear I think 30 FPS is still perfectly playable, but it's objectively inferior to 60 FPS in any game. 30 FPS has never been, and never will be, a conscious artistic decision by any developer. It is always, always a technical one that results in the developer trading off framerate for better visuals.
Just quoting some people that get it.

Anyone talking just framerates is missing half of the "cinematic" equation, the other half being motion blur. Yet these threads are always full of people talking only about framerates. Get it through your damn skulls: you can't compare framerates in films and games without talking about motion blur.
 
His post insinuated CoD = 60fps and sells great, therefore 60FPS = better sales.

I flipped it with Halo = 30fps, therefore fps doesn't matter.

Which is the bigger franchise?

Also everyone was talking about how CoD needs 60 fps to retain its gameplay feel and input.

Do you want to show evidence of anyone besides you talking about sales?
 
His post insinuated CoD = 60fps and sells great, therefore 60FPS = better sales.

I flipped it with Halo = 30fps, therefore fps doesn't matter.

Yeah, I didn't insinuate that.

What I said is that 60 FPS matters to some developers because it's part of the experience they want to deliver: a package that includes exceptional responsiveness during multiplayer matches, delivered in part by the framerate. And it's noticeable even to a layperson who doesn't know the technical details.

I never said that a high framerate directly leads to better sales in a vacuum.
 
I can argue that, but wouldn't you agree the majority of us grew up with 24 as the standard? So in the eyes of a good chunk of people, including developers, 30fps would have a more cinematic feel. It's a simple concept to understand; I don't see why it's so laughable at all.

Oh, I definitely agree that this explains the phenomenon, I'm just not sure I'd consider it a particularly good justification. "It's what I'm used to" isn't likely to convince many people who disagree of the validity of your position.

If you're just saying "I grew up with this and it's what I'm used to so I prefer it," then sure, that's absolutely fine.
 
60 is preferable to 30, but only if those numbers are solid and stable. I'd trade "usually 60fps" for a rock-solid locked 30fps in a second.

Somewhat on that topic, why are 60 and 30 the only benchmarks? If a game can't achieve 60fps, why is 30 the default option for a next step down? Why don't we see any half-step compromises like a locked 40 or 50 or what have you?
 
60 is preferable to 30, but only if those numbers are solid and stable. I'd trade "usually 60fps" for a rock-solid locked 30fps in a second.

Somewhat on that topic, why are 60 and 30 the only benchmarks? If a game can't achieve 60fps, why is 30 the default option for a next step down? Why don't we see any half-step compromises like a locked 40 or 50 or what have you?

I think it's because of the monitor's refresh rate, but I can't explain it technically the way some here could.
 
I don't think that's how that works.

Which Halo are you talking about that runs at 30?

Maybe I am wrong, I haven't played Halo in a bit, but the newer ones, or new-ish ones, run at 60.

AFAIK the majority of the games are 30FPS. Reach may be 60 though? Not positive on that.

It's about the feel of 60 fps. It's about the feel of 24 fps. It is more like real life. I don't understand any of this, and it is very difficult not to think that people using these terms are just trolling other people. What exactly is this "feel" of 24 frames per second? Is the feel of 24 fps similar to 30 fps?

I did not want to post in this thread, but I think it is ridiculous that people are attempting to defend that 24 fps is good because it has the feel of 30 or 60 fps, and it is similar to real life.

Your lack of knowledge on the subject is exactly what I'm referring to. You can't participate in this argument when you don't even have a basic understanding of what any of it means. Try googling the "film look" and you may learn a thing or two.

Which is the bigger franchise?

Also everyone was talking about how CoD needs 60 fps to retain its gameplay feel and input.

Do you want to show evidence of anyone besides you talking about sales?

You mentioned "masses of people" don't care or whatever. I debunked that those same masses of people care at all showing that massive franchises have succeeded being both 30FPS and 60FPS.

Yeah, I didn't insinuate that.

What I said is that 60 FPS matters to some developers because it's part of the experience they want to deliver: a package that includes exceptional responsiveness during multiplayer matches, delivered in part by the framerate. And it's noticeable even to a layperson who doesn't know the technical details.

I never said that a high framerate directly leads to better sales in a vacuum.

But that's what was implied; see the post above. Besides, I'll say it again: I'm not arguing about 30fps vs. 60fps input, benefits, pros and cons, etc. I'm arguing in defense of the term "filmic look", or cinematic feel, or whatever else people toss around, and people simply laugh at the notion. The film look is a very real thing.

Oh, I definitely agree that this explains the phenomenon, I'm just not sure I'd consider it a particularly good justification. "It's what I'm used to" isn't likely to convince many people who disagree of the validity of your position.

If you're just saying "I grew up with this and it's what I'm used to so I prefer it," then sure, that's absolutely fine.

Again, I think you're missing the point. I'm arguing in defense of the terms used to describe it. It's there and it's very real, for whatever reason: we're used to it, that type of motion is effective, the same way different film speeds, different angles, etc. have an effect on the emotion.
 
I like being unable to tell what framerate something runs at. Being able to notice minute compression and analogue artefacts in video is bad enough. Fast movies don't bother me, nor do slow games, unless they dip below 20 or so. Still, all things being equal, a higher frame rate should never hurt.

Here is a game running at 24 fps:

http://a.pomf.se/exqckv.webm

No, it isn't. The video runs at a nominal 24fps, but if you framestep through it, you will see loads of duplicate frames. In the first second (24 frames), there are actually only 12 unique frames (the duplicates still differ slightly, but only because of compression artefacts). Those 12 frames are not equidistant, making it look even jerkier. This is not an accurate representation of a game running at 24fps.

Number of times each frame occurs during the first second: 1, 2, 3, 1, 1, 3, 1, 4, 1, 1, 5, 1

That's right, it's showing the frame with 4 dups for more than 208ms (5 occurrences × 1/24 s ≈ 208ms).
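For anyone who wants to check the arithmetic, here's a quick Python sketch (the counts are just the ones quoted above) that turns those occurrence counts into on-screen durations at the nominal 24fps:

```python
# Occurrence counts for the unique frames in the first second of the webm,
# as counted by framestepping (nominal container rate: 24fps).
counts = [1, 2, 3, 1, 1, 3, 1, 4, 1, 1, 5, 1]

frame_time_ms = 1000 / 24                    # ~41.7 ms per container frame
durations = [c * frame_time_ms for c in counts]

print(sum(counts), "container frames,", len(counts), "unique frames")
print("longest hold: %.0f ms" % max(durations))  # 5 * 41.7 ms ~ 208 ms
```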


This one is effectively running at 12fps too. Number of occurrences: 2, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3

This is very odd. Even when recording at a higher frame rate, a down-conversion shouldn't lead to exact duplicate frames. An up-conversion from the lower framerate? Possibly. But even in that case, the pattern should be regular.

Maybe this could lead to that effect: Clamp the game to run at 24fps, capture at a higher rate and then down-convert to 24fps.

TV (which I think runs at 50Hz).

If you live in PAL land. If you live in NTSC land it runs at 60Hz.

Edit: Added some more thoughts.
 
Here is a game running at 24 fps:

http://a.pomf.se/exqckv.webm

Here is another example of 24 fps gameplay

http://a.pomf.se/plweeh.webm


Here is Dark Souls II rendered, recorded, and played at 24fps
http://a.pomf.se/vndaxe.webm

It is a considerably different looking experience than yours.

I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.
 
Framerates around 24fps have noticeable stuttering, even if only on a somewhat subconscious level. With movies we tend to only notice the stuttering when the camera pans at a certain speed (which is why filmmakers avoid doing that). Thing is, for storytelling, this super-subtle stuttering actually has a particular psychological effect that works well for cinematic purposes. It's almost like you're aware that you're watching a fast photo flipbook telling you a story. Speed things up to 60fps and this effect is lost, giving you the 'reality TV' feel. Neither is better than the other; they just serve different purposes.

This is my own theory on the subject, and I strongly believe in it based on all the experience I've built up with CG, real-time graphics, and real-life cinematography.
 
60 is preferable to 30, but only if those numbers are solid and stable. I'd trade "usually 60fps" for a rock-solid locked 30fps in a second.

Somewhat on that topic, why are 60 and 30 the only benchmarks? If a game can't achieve 60fps, why is 30 the default option for a next step down? Why don't we see any half-step compromises like a locked 40 or 50 or what have you?

Because monitor refresh rates are usually multiples of 30Hz (typically 60Hz), so 30fps and 60fps divide into the refresh rate evenly.

Hence why things like G-Sync and FreeSync are awesome.
 
60 is preferable to 30, but only if those numbers are solid and stable. I'd trade "usually 60fps" for a rock-solid locked 30fps in a second.

Somewhat on that topic, why are 60 and 30 the only benchmarks? If a game can't achieve 60fps, why is 30 the default option for a next step down? Why don't we see any half-step compromises like a locked 40 or 50 or what have you?

Televisions and monitors typically refresh at 60Hz. When running at 45fps, some frames will be displayed longer than others, causing judder, or a frame will change while it's being displayed, causing tearing. G-Sync dynamically adjusts the refresh rate to match the framerate, eliminating the tearing and judder at those in-between framerates.
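To put rough numbers on the judder part, here's a simple model I'm assuming (a vsynced 60Hz display with no adaptive sync, each frame held until the first refresh after the next frame is ready):

```python
# Rough model: a 45fps source on a vsynced 60Hz display.
import math

refresh_hz, source_fps = 60, 45
refresh = 1 / refresh_hz

holds_ms = []
shown_at = 0.0
for i in range(1, 9):                                   # next 8 source frames
    ready = i / source_fps                              # frame i finishes rendering
    flip = math.ceil(ready / refresh - 1e-9) * refresh  # next vsync after that
    holds_ms.append(round((flip - shown_at) * 1000, 1))
    shown_at = flip

print(holds_ms)  # mix of ~16.7 ms and ~33.3 ms holds (a 2:1:1 cadence) -> judder
```

At a clean 30fps every frame is held for exactly two refreshes, and at 60fps for exactly one, which is why those two targets pair so neatly with a 60Hz panel.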
 
Here is Dark Souls II rendered, recorded, and played at 24fps.

It is a considerably different looking experience than yours.

I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.

Nailed it. 24 fps is not ideal... sure! But it can be completely playable, and it's far from those WD captures. I wouldn't have been so amazed by Shadow of the Colossus and Dark Souls otherwise.

And damn! How I would like to see Dark Souls 2 released on PS4 / Xbox One.
 
The problem I have with these webms posted is that the game itself isn't designed to run at that low a framerate, so it will look worse than one which was intended to. Motion blur and the like will be finely tuned around that target.

Needs motion blur.

Think you should redo this. It doesn't look like you're using motion blur either; I think it would help at a lower frame rate. I locked mine at 24 and it didn't seem that choppy. I even ran around, drove around, shot the place up. It was decent enough, though not something I'd want to play for a whole session.

The video even hitches around 11-12 seconds.

Yes. But in your example there is no motion blur.

As I stated in another thread, motion blur should not be used as a band-aid to cover up a game unsteadily juddering along at 20-30fps. That crap needs to stop.

This. People always forget that 24fps only looks good because of motion blur. Take that out and movies would look like shit.

Thank You.
 
The conditions under which a camera captures 24fps and those under which a computer produces it are entirely different. Why not go above 30fps and use post-processing to emulate all the motion blur inherent to a camera? Think of an animated film: if they just rendered enough frames to cover its length at 24fps without adding anything like motion blur, it would look choppy and wouldn't look like the films people are used to seeing. Cameras don't get the luxury video games do, where all the information needed to draw a frame is already available rather than having to be captured.
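For what it's worth, the camera-side relationship is simple to write down. The numbers below are my own illustration, using the standard rotary-shutter rule that exposure per frame is (shutter angle / 360) divided by the framerate:

```python
def exposure_time(fps, shutter_angle_deg=180.0):
    """Exposure per frame for a rotary shutter: (angle / 360) / fps."""
    return (shutter_angle_deg / 360.0) / fps

def blur_length_px(speed_px_per_s, fps, shutter_angle_deg=180.0):
    """How far (in pixels) an object smears during one exposure."""
    return speed_px_per_s * exposure_time(fps, shutter_angle_deg)

# An object crossing a 1920px-wide frame in two seconds (960 px/s):
print(exposure_time(24))             # ~0.0208 s, i.e. the classic 1/48 s
print(blur_length_px(960, 24))       # ~20 px of smear per frame at 24fps
print(blur_length_px(960, 60))       # ~8 px at 60fps -- much crisper motion
```

A renderer chasing the filmic look has to synthesize that ~20px smear itself, which is exactly the post-processing being asked for above.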

All this filmic framerate stuff just sounds like spin to make something average seem like it's special.
 
I'd have zero problems if a game was 24fps, as long as there was an artistic reason behind it. Replicating the "filmic" experience would count in my opinion.

To me it's no different than making a game black and white, with pixelart graphics, 2.4:1 aspect ratio, etc. The artistic vision shouldn't be restrained just because it decreases the image quality.


If, however, a game just ran badly and hovered at an unlocked 24fps, then it'd rightfully deserve all the hate it'd get.
 
It's about the feel of 60 fps. It's about the feel of 24 fps. It is more like real life. I don't understand any of this, and it is very difficult not to think that people using these terms are just trolling other people. What exactly is this "feel" of 24 frames per second? Is the feel of 24 fps similar to 30 fps?

I did not want to post in this thread, but I think it is ridiculous that people are attempting to defend that 24 fps is good because it has the feel of 30 or 60 fps, and it is similar to real life.

We all thought The Hobbit felt weird at 48fps, so I assume people are applying that same feeling to the idea of 60fps video games. Eraserhead 'feels' better on an old VHS compared to 4K/60fps, that kind of thing. It's not too stupid a concept; there's just no objectivity to it.
 
60 is preferable to 30, but only if those numbers are solid and stable. I'd trade "usually 60fps" for a rock-solid locked 30fps in a second.

Somewhat on that topic, why are 60 and 30 the only benchmarks? If a game can't achieve 60fps, why is 30 the default option for a next step down? Why don't we see any half-step compromises like a locked 40 or 50 or what have you?
I think it has to do with how often TVs refresh per second. A normal one refreshes 60 times per second (60Hz), so a 60 fps game shows exactly 60 different frames per second, while a 30 fps game shows just 30 of them (each held for two refreshes).
 
Now to be fair, that appears to be a framerate generated without the normal optical blurring that characterizes what people mean when they're talking about a filmic look.
Guess what, video games aren't going to employ those film techniques either, so that's a moot point.
 
Televisions and monitors typically refresh at 60Hz. When running at 45fps, some frames will be displayed longer than others, causing judder, or a frame will change while it's being displayed, causing tearing. G-Sync dynamically adjusts the refresh rate to match the framerate, eliminating the tearing and judder at those in-between framerates.

Shame that G-Sync is Nvidia-based and won't make it onto consoles as long as they use AMD chips. Really pissed that both Nvidia and Intel have such shitty relationships with Sony, MS, and Nintendo despite being the best in the industry for performance and features.
 
This is bullshit. Please do not post this misinformed stuff, it furthers incorrect biases on gaf.

Yeah "bullshit"
[image: 30fpsconsolepcnupw4.png]
 
Here is Dark Souls II rendered, recorded, and played at 24fps.

It is a considerably different looking experience than yours.

I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.

Dark Souls has per-object motion blur. It appears that Watch_Dogs really does not.

Yeah "bullshit"
[image: 30fpsconsolepcnupw4.png]

There are google searches for "how is babby formed?"

Your point is?
 
I'd have zero problems if a game was 24fps, as long as there was an artistic reason behind it. Replicating the "filmic" experience would count in my opinion.

To me it's no different than making a game black and white, with pixelart graphics, 2.4:1 aspect ratio, etc. The artistic vision shouldn't be restrained just because it decreases the image quality.


If, however, a game just ran badly and hovered at an unlocked 24fps, then it'd rightfully deserve all the hate it'd get.

3:2 pulldown sucks too much on too many TVs to go with 24fps over 30fps.
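A quick sketch of why 3:2 pulldown looks the way it does (my own illustration of the standard cadence): 24fps content on a 60Hz set gets its frames held for alternately 3 and 2 refreshes, so on-screen times alternate between 50 ms and ~33 ms.

```python
# 3:2 pulldown: mapping 24fps content onto a 60Hz display.
# Every pair of source frames covers 5 refreshes: one held for 3, one for 2.
refresh_ms = 1000 / 60

holds_ms = [round((3 if i % 2 == 0 else 2) * refresh_ms, 1) for i in range(8)]

print(holds_ms)       # [50.0, 33.3, 50.0, 33.3, ...] -> uneven motion (judder)
print(sum(holds_ms))  # ~333 ms, i.e. 8 frames at 1000/24 ms each on average
```

The average timing works out, but the frame-to-frame unevenness is what people notice on sets that can't switch to a 24Hz-friendly mode.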
 
Like, real-life movement feels natural. It just happens. You don't notice a framerate.

24fps movies are like that. When people move, it feels like how my eyes see real people.

60fps is more like "look at me, I'm sooo smooth!" It feels artificial.
After you use SVP for all of your videos like I do, 24 fps will feel like laggy shit. Plus, there was an article somewhere that said 72 fps induced the most emotion in the audience.
 
I'm looking and can't find a 60 vs 24fps movie/tv show clip. Are there any out there?

To me a movie/tv show looks pretty terrible at 60fps (I see the soap opera/fast forward/cheap cam effect) whereas a game is better at 60fps (though I'm ok with a 30fps game and can't visually see the difference).
 
The way that films are shot, edited, and touched up is completely different from the way games are rendered. Even motion blur was traditionally achieved in movies through the exposure of film: the blur appeared as a natural product of object or scene motion captured into one frame. Blur in games fundamentally doesn't work that way. It's simulated, just like everything else, and in the current generation the blur doesn't look anywhere near as natural as it does in film.

If you want a game to "look cinematic", I expect some serious effort on the technical side of actually making a game look cinematic. I expect pristine, clean image quality, blur and depth of field that look indistinguishable from their film counterparts, and a complete lack of loading screens so the story misses no beats.

Oh, but that's not what we're getting with these "cinematic" games, is it? We still get shitty blur, jaggies out the ass, poorly segmented sequences or excessive loading times that, if they ever appeared in any movie, would make people walk out of the theater.

"My game is cinematic because it runs at 30 FPS" is the biggest load of horseshit I've ever heard to justify the framerate. That's not how a film works. I can't take a fucking iPhone camera, chop everything to look 24 FPS, then claim a video of my dog skateboarding "makes it look cinematic". Any developer making such a boneheaded argument is blowing smoke up your ass.

EDIT: I want to make it clear I think 30 FPS is still perfectly playable, but it's objectively inferior to 60 FPS in any game. 30 FPS has never been, and never will be, a conscious artistic decision by any developer. It is always, always a technical one that results in the developer trading off framerate for better visuals.

I think it is partly an artistic choice. Like huge parts of their audience, they don't know any better.
I'm looking and can't find a 60 vs 24fps movie/tv show clip. Are there any out there?

To me a movie/tv show looks pretty terrible at 60fps (I see the soap opera/fast forward/cheap cam effect) whereas a game is better at 60fps (though I'm ok with a 30fps game and can't visually see the difference).

The soap opera look is due to lighting, for fucks sake.
 
How many fps does real life run at?

Obviously, we can calculate our brain's fps through frame times. You simply divide 1 second (or 1000ms) by the time it takes for our brain to render a frame--i.e. our brain's "frame latency" (about 100ms). 1000/100 = 10, so we see at 10fps.

I don't see what all you frame rate junkies are on about; 30fps, hell, even 24fps is more than enough.

Yes, I am joking :P
 
Here is Dark Souls II rendered, recorded, and played at 24fps.

It is a considerably different looking experience than yours.

I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.

Looks better but there's still detectable jitter that just drives my brain crazy.

I will gladly sacrifice a few things to keep a steady 60FPS+, doesn't matter what genre the game is.
 
Obviously, we can calculate our brain's fps through frame times. You simply divide 1 second (or 1000ms) by the time it takes for our brain to render a frame--i.e. our brain's "frame latency" (about 100ms). 1000/100 = 10, so we see at 10fps.

I don't see what all you frame rate junkies are on about; 30fps, hell, even 24fps is more than enough.

Yes, I am joking :P

10fps? Nah, you've gotta double that at least. Nyquist rate and all that.

;)
 
Motion blur is a standard visual effect in videogames. Done correctly, it drastically improves the apparent fluidity of motion.

Yes, and it's done for games currently at 30/60.

You know what else will help fluidity? Higher framerates for your games.
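For anyone curious what "done correctly" roughly involves, here's a heavily simplified Python sketch of velocity-based (per-pixel / per-object style) motion blur. It's my own toy illustration, not any engine's actual pass; the scene and numbers are made up:

```python
import numpy as np

def velocity_blur(color, velocity, samples=8):
    """Very simplified screen-space motion blur.

    color:    HxWx3 float image
    velocity: HxWx2 per-pixel motion in pixels (how far each pixel moved
              since the previous frame)
    Each pixel is averaged with taps taken backwards along its own velocity
    vector, which is roughly what per-object motion blur passes do.
    """
    h, w, _ = color.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros_like(color)
    for i in range(samples):
        t = i / samples
        sx = np.clip((xs - velocity[..., 0] * t).astype(int), 0, w - 1)
        sy = np.clip((ys - velocity[..., 1] * t).astype(int), 0, h - 1)
        out += color[sy, sx]
    return out / samples

# Toy scene: a white square moving 12 px/frame to the right.
color = np.zeros((64, 64, 3))
color[24:40, 24:40] = 1.0
velocity = np.zeros((64, 64, 2))
velocity[24:40, 24:40, 0] = 12.0

blurred = velocity_blur(color, velocity)
print(blurred.shape, round(float(blurred.max()), 3))
```

The key point is that it only blurs where the velocity buffer says something moved, unlike a camera, where the blur falls out of the exposure for free.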
 
Yeah "bullshit"
TVS. AND. MONITORS. THE DIFFERENCE IS IN THE DISPLAY NOT THE SYSTEM.

I posted on that earlier, and I've actively tried hooking a computer up to a TV before to see how it'd look. That's where the difference actually is. Consoles are just specialized computers anyway.
Here is Dark Souls II rendered, recorded, and played at 24fps.

It is a considerably different looking experience than yours.

I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.
That weirdly doesn't actually look that bad, but half of that may be due to higher settings than on consoles (or the image being squished down) and the fact that I'm watching, not playing. It probably wouldn't feel that great actually playing at that, and it's a huge, huge mistake for anyone to actually target 24 over 30, especially without motion blur.
 
That weirdly doesn't actually look that bad, but half of that may be due to higher settings than on consoles (or the image being squished down) and the fact that I'm watching, not playing. It probably wouldn't feel that great actually playing at that, and it's a huge, huge mistake for anyone to actually target 24 over 30, especially without motion blur.

As the person playing, it was absolutely playable. Not as great as 60fps of course, but I was able to play. I was even able to parry an enemy, and dodge through the turtle enemy's swings.

It doesn't feel terrible, and doesn't look anywhere near as bad as the previously posted examples.
 
I think it is partly an artistic choice. Like huge parts of their audience, they don't know any better.


The soap opera look is due to lightning, for fucks sake.

I assume you mean lighting? And I don't know anything about it really, but am pretty sure that isn't the case. If it were, something at 24fps with the "wrong lighting" would have the soap opera effect as well, no?
 