In defense of the "filmic" look.

So one thing I just don't understand about 24fps: why is it whenever I watch a movie, it looks very smooth, but whenever there's a game running at ~24fps it looks really choppy and feels like it's at least 10fps lower than the movie?

This has been mentioned before. It's because filming real life gives you high-quality motion blur. Motion blur in video games is substantially lower quality or absent altogether.
 
Video games look nothing like films, they shouldn't try to be films either. It's a young, interactive medium, piling on baggage from a completely different industry like this is insane.
 
You obviously didn't see the thread about Gamersyde's new video hosting site.

Yes, I did.

You don't control the video do you?

Yes, you cannot control the motion of a character when you watch a video of gameplay footage.

I was pointing towards having a similar experience when you are playing a game at 30FPS, which is the same as watching a YouTube 1080p@30fps video full screen on a 1920x1080 display
(all the while, playing the game will look and feel even better).

-----
Based on my personal experience:

- If there are lots of details and many elements on screen, having the game run at 60FPS tends to distract me from the details/elements; my eyes instead just fix onto the motion cues (and/or just pure backend gameplay elements/mechanics).

- If there are fewer details and fewer elements on screen, then having the game run at 60FPS is great (because my emphasis would be on watching motion cues and adjusting my gameplay with respect to interaction in the game world).
 
They seem to enjoy that cinematic feel of Halo as well... weird.

I think COD is generally enjoyed more as a multiplayer experience while Halo is enjoyed more for its single player (and co-op) campaign. It is all about perception in delivering what the general audience wants.

People keep going back to COD because it does deliver that smooth and responsive feeling (more or less) online in a competitive environment. Multiplayer is what this series is most known for and the main reason why the games are designed around 60fps.

Halo on the other hand gets perceived more as a single player and co-op cinematic experience. This game series trades off framerate for visual eye candy to help create that experience. I think gamers are willing to forgive framerate drops in a series like this as long as it delivers what they are ultimately looking for.
 
The Hobbit looked like shit in 48fps.

This is so utterly stupid. Imagine the same game. Let's call it The Last of Us. Let's say it was released on PS3 and will be released on PS4. Let's say part of that PS4 release will be 60 instead of 30 fps. Do you think this release will be a downgrade, weakening the premium cinematic experience that only 20-30 fps was able to deliver? Do you think this rerelease will look like shit?
 
Real life doesn't move in fast forward like 60fps looks

So your eyes go into overdrive just for 60fps games while reality around the screen is 30fps or lower?

Look, I'm a firm believer in the 30fps+motion blur look for certain games, but this doesn't make any sense :P

If anything, 60fps in movies looks weird because it is much closer to what real life looks like, so you're much more likely to pick out something that looks wrong or out of place.
30fps is like a screen between you and the content you're watching: it puts some distance between your eyes and the fiction while masking some fine detail.

The Hobbit certainly looked weird because our brains are conditioned by years of the 24fps look for movies and 50/60fps for live reports and cheap soap operas, BUT also because the increased clarity of a smoother refresh makes it a lot harder to hide flaws in lighting, missing details in characters' clothes, CGI, etc.
48fps made it a lot easier to tell what was real and what was built on set in The Hobbit.

This is especially true for videogames. 60fps' increased perception of depth makes geometry stand out: it's much easier to spot polygonal edges and flat surfaces, and the increased clarity makes the image in motion appear sharper.

These can be seen as advantages or disadvantages depending on the result you're going after.

30 fps (especially with motion blur) helps cheat your eyes and hide typical CGI flaws, so when going for a filmic or even realistic look, it might be your best bet.

I myself prefer the look of 30fps+motion blur in some cases (and I absolutely hated 30fps games, especially around the Dreamcast era), and it's completely understandable that it would be an artistic choice.
 
The Hobbit looked like shit in 48fps.
It looked a lot better than it did in 24fps.
Pretty sure the problem for most viewers was the decision to use less motion blur than people were used to. (A 24fps 180-degree shutter is equivalent to a 48fps 360-degree shutter; The Hobbit used 48fps with a 270-degree shutter.)
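For anyone who wants to sanity-check that shutter math, here's a minimal sketch (the formula is the standard one, exposure per frame = (shutter angle / 360) / fps; the specific Hobbit numbers are just the ones quoted above):

```python
def exposure_time(fps: float, shutter_angle_deg: float) -> float:
    """Seconds of light captured per frame for a given shutter angle."""
    return (shutter_angle_deg / 360.0) / fps

# 24fps at a 180-degree shutter and 48fps at a 360-degree shutter both
# expose each frame for 1/48 s, so per-frame blur is comparable.
print(exposure_time(24, 180))  # 0.02083... (1/48 s)
print(exposure_time(48, 360))  # 0.02083... (1/48 s)

# The quoted 48fps / 270-degree combination exposes each frame for only
# 1/64 s, i.e. noticeably less motion blur per frame.
print(exposure_time(48, 270))  # 0.015625 (1/64 s)
```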
 
One and done. People who say they want games to look "cinematic" have no idea what they are talking about. What does that even mean? Cinema is not gaming. It's like saying, "I wish this book were more like a song."

Also, 24 fps captured on film is very different from 24fps digitally rendered. On film, you'll see that images are somewhat blurred (especially if the camera is moving fast), while in games 24 fps will look choppy. We got so used to this cinematic feeling that when digital cameras were introduced, making higher framerates and a 'cleaner image' possible, we all felt the movie looked weird. I mean, the first time I saw a movie running natively on a 60 Hz HDTV it felt really weird.
 
''We're going for a filmic look'' is just another way of saying that they couldn't achieve 60fps. 60fps is always better for gaming.

Not at all.
Just like 800p is not just another way of saying "we couldn't achieve 1080p".

Of course they are enjoying the benefits of having a lot fewer pixels per second to render, but it's perfectly understandable that it would be an aesthetic choice.
 
The Hobbit looked like shit in 48fps.

I adored it and wished I could watch HFR version of all my movies. Hopefully Cameron at least follows suit.

Of course it bums me out that I can't rewatch The Hobbit, because I'm unwilling to watch it at a low framerate and there is no support for HFR at home.
 
Has there been a tryout for (something along the lines of) frame interpolation for video games running at low frame rates?
Would a real-time implementation be possible (via software)?
 
Has there been a tryout for (something along the lines of) frame interpolation for video games running at low frame rates?
Would a real-time implementation be possible (via software)?

Couple of issues with that:
- It will not improve responsiveness to inputs, which is something many people like about games running at higher framerates.
- It will, on the contrary, introduce additional latency (see the sketch below).
- Motion interpolation is not all that great and can lead to artefacts.
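To make the latency point concrete, here's a minimal sketch of naive interpolation by linear blending. This is hypothetical illustration code, not how SVP or any shipping interpolator works (those use motion estimation), but the structural problem is the same: you cannot emit anything between frame N and frame N+1 until frame N+1 exists.

```python
import numpy as np

def interpolate_stream(frames, factor=2):
    """Naive frame-rate doubling by linear blending.

    Output lags the newest rendered frame by one full source-frame
    interval (33 ms at 30fps) before any processing time is counted,
    because the blend needs the *next* frame to already be available.
    """
    prev = None
    for cur in frames:
        if prev is not None:
            for k in range(factor):
                t = k / factor  # t == 0 reproduces prev; in between is a blend
                yield ((1 - t) * prev + t * cur).astype(prev.dtype)
        prev = cur
    if prev is not None:
        yield prev  # the final source frame

# Example: three dummy 2x2 greyscale "frames" at 30fps -> ~60fps output.
src = [np.full((2, 2), v, dtype=np.float32) for v in (0.0, 1.0, 2.0)]
for out in interpolate_stream(src):
    print(out.ravel())  # 0.0, 0.5, 1.0, 1.5, 2.0 frames in order
```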
 
Those curious should give http://www.svp-team.com/ a try. This is frame interpolation software that works on any video content. The difference is very noticeable but it's up to you to decide whether the higher 'framerate' is worth it. I've tested it and used it a lot and ultimately decided that I prefer the 24 fps standard.

Results can be very impressive in anime too, but I decisively went back to standard 24fps.



Also, everyone who has seen both knows that subjective perception plays a basic, crucial, key role in this discussion.
 
24 FPS is acceptable if you have a 120Hz or 144Hz monitor.

On standard 60hz it looks like shit.
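My reading of why the refresh rate matters here is simple arithmetic: 120 and 144 are exact multiples of 24, so every film frame can be held for the same number of refreshes, while on 60Hz the repeat count has to alternate between 2 and 3 (3:2 pulldown judder). A quick check:

```python
# 60Hz cannot hold each 24fps frame for a whole number of refreshes;
# 120Hz and 144Hz can, so the cadence stays even.
for hz in (60, 120, 144):
    repeats = hz / 24
    note = "even cadence" if repeats.is_integer() else "uneven 3:2 cadence (judder)"
    print(f"{hz}Hz: each 24fps frame shown {repeats} times -> {note}")
```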
 
Has there been a tryout for (something along the lines of) frame interpolation for video games running at low frame rates?
Would a real-time implementation be possible (via software)?

This is from 2010:
http://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article

(Possibly fewer artefacts and even less lag than using motion blur in his case)

Some promising concepts, but nothing ever came out of it, and LucasArts is defunct now anyway (I think).
 
24 fps works for film only because the actual shutter speed can be set such that you get blur, blending frames together and making it appear smooth. If game makers want that "filmic" look at low fps, they will need to simulate this blur as well. Simply running at a lower fps without it will just make it look choppy.

I dislike low fps in games mainly because of responsiveness, but to each their own.
 
Has there been a tryout for (something along the lines of) frame interpolation for video games running at low frame rates?
Would a real-time implementation be possible (via software)?

Some prototype by LucasArts, IIRC.
I think it was demoed in The Force Unleashed or something like that, but it wasn't in the actual game.
It kinda worked in videos, but it won't change responsiveness and could introduce additional lag because of the time needed to process frames.

Either way, a frame interpolation 'device' or algorithm is the only way 60fps will ever become standard in games.

Edit: Beaten

24 fps works for film only because the actual shutter speed can be set such that you get blur, blending frames together and making it appear smooth. If game makers want that "filmic" look at low fps, they will need to simulate this blur as well. Simply running at a lower fps without it will just make it look choppy.

I guess they could have the game render internally at 60fps but output at 24Hz, using the otherwise-discarded 36 frames per second for motion blur purposes.

Btw I absolutely love the look of that 24fps high-quality motion blur Portal 2 video (and the similar Sonic video posted a few days ago).
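A minimal sketch of that render-high-and-blend idea, under a couple of assumptions: the quoted 60-in/24-out split doesn't divide evenly, so this uses a 96fps internal rate (an exact multiple of 24), and "frames" are plain numpy arrays standing in for rendered images. It's an accumulation-buffer approximation of a camera shutter, not any particular engine's technique.

```python
import numpy as np

def accumulate_to_24fps(subframes, internal_fps=96, out_fps=24,
                        shutter_angle_deg=180):
    """Average high-rate sub-frames down to out_fps to fake shutter blur.

    With internal_fps=96 there are 4 sub-frames per 24fps output frame;
    a 180-degree shutter means only the first half of that interval
    (2 sub-frames) contributes, mimicking a film camera's open shutter.
    """
    step = internal_fps // out_fps                       # sub-frames per output frame
    open_count = max(1, round(step * shutter_angle_deg / 360))
    for i in range(0, len(subframes) - step + 1, step):
        yield np.mean(subframes[i:i + open_count], axis=0)

# Example: a bright dot sweeping across a 1x8 "screen" at 96fps.
frames = [np.zeros((1, 8), dtype=np.float32) for _ in range(8)]
for x, f in enumerate(frames):
    f[0, x] = 1.0

for blurred in accumulate_to_24fps(frames):
    print(np.round(blurred, 2))  # the moving dot gets smeared across two pixels
```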
 
Here is a game running at 24 fps:

http://a.pomf.se/exqckv.webm

This is honestly the worst NeoGAF thread I've read in a while. Early on, Dennis posted some WebMs supposedly displaying 24fps; everyone goes crazy without even thinking for a second and says that 24fps is terrible based on that.

Someone then correctly debunks Dennis' WebMs, showing that they are actually running at a mismatched and choppy 12fps... everyone ignores that, and Dennis miraculously disappears from the thread.

My two cents: 24fps is most definitely noticeable when compared to 60fps, but is it as unplayable as those Frankenstein-esque WebMs make it out to be? Hell no.
 
This is honestly the worst NeoGAF thread I've read in a while. Early on, Dennis posted some WebMs supposedly displaying 24fps; everyone goes crazy without even thinking for a second and says that 24fps is terrible based on that.

Someone then correctly debunks Dennis' WebMs, showing that they are actually running at a mismatched and choppy 12fps... everyone ignores that, and Dennis miraculously disappears from the thread.

My two cents: 24fps is most definitely noticeable when compared to 60fps, but is it as unplayable as those Frankenstein-esque WebMs make it out to be? Hell no.

I agree, that WebM is a poor example.

Those curious should give http://www.svp-team.com/ a try. This is frame interpolation software that works on any video content. The difference is very noticeable but it's up to you to decide whether the higher 'framerate' is worth it. I've tested it and used it a lot and ultimately decided that I prefer the 24 fps standard.

Results can be very impressive in anime too, but I decisively went back to standard 24fps.



Also, everyone who has seen both knows that subjective perception plays a basic, crucial, key role in this discussion.

Thanks for this, didn't know it even existed. Also, you can achieve a similar effect with the "motion blur" option in KMPlayer. Just tried some films with SVP and it looks really interesting, will try it out more.

Anyway, I'd like to see a trend of higher-framerate films, just to see if we'd all get used to it. I'm not really sure something like 48 fps is "more natural", since there is a lot of motion blur and afterimage with the naked eye anyway, and stuff like fast shutter speeds or very high framerates aren't always faithful to what we actually see and perceive. It also depends on how illuminated the location is (night, day), peripheral vision, how trained our eyes are (a normal guy as opposed to an air force pilot), what we're currently focusing on or following with our eyes, etc. All of these visual media are approximations, adapting moving images to be pleasing to the eye.

Video games are a different type of medium than film, and the difference is going to be even bigger if VR catches on, at least in terms of how we're displaying the finished frames, with all sorts of effects that don't work well at all in VR. I don't think making "filmic games" is inherently bad, since the medium allows for a lot of experimentation and freedom in how you present the game, but it shouldn't be the main or only direction, which is what a lot of games are mostly doing right now. I guess there'll be a time when everyone grows tired of that trend, along with sufficient technology changes to allow games to grow out of it. There will always be some focus on cinematic experiences, but it might stop being the main one.

Hell, there are even so many film genres, techniques of filming, styles and whatnot and most games have only begun to imitate Hollywood action flicks.
 
Films run at 24 fps because the industry decided that was the minimum rate needed to fool your eyes. They didn't bother to make a better standard because it would cost more. So this whole 24 fps cinematic motion is just a compromise that we got used to, not something that was considered the best for motion pictures or an artistic decision.

Both games and films would be better with higher frame rates. More frames = more visual information and more fluid motion. Simple as that. But games need this even more because you need responsiveness and you control the camera movement.
 
People are overthinking (or maybe underthinking) this entire situation, and it's kinda sad.

The situation is simple:

- Team wants game to have a filmic look.
- Films LITERALLY run at 24fps (in general). Team acknowledges this.
- Again, TEAM WANTS GAME TO HAVE A FILMIC LOOK. Read this line many times.
- Team decides that in order for the game to LOOK (keyword) closer to a film, they should choose the most acceptable framerate closest to the 24fps that films use. This is 30fps.

NOWHERE do they say 30fps looks, feels, or plays better than 60fps. NOWHERE do they ever say they CONSIDERED running the game at 24fps.

It doesn't matter why films run at 24fps, nor do any technical details about that matter. The only thing that matters is the fact that films DO run at 24fps, and films...look filmic.

To say this is bullshit, is to say that films running at 24fps is bullshit, which would mean what you're saying is bullshit.
 
Here is Dark Souls II rendered, recorded, and played at 24fps
http://a.pomf.se/vndaxe.webm

It is a considerably different looking experience than yours.

I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.

I might be nitpicking but that's so obviously low framerate. I mean, I'd probably be able to play it but it would bug me very much during gameplay, it's jittering really badly.
 
60 fps only looks strange in movies because it is uncommon. That's literally it.

Yep. That's it.

And 24/30 fps shows look better than 30 fps games because of the actual motion blur. Simulated motion blur isn't there yet. But who cares, when motion blur is just a mask for low quality.
 
People are overthinking (or maybe underthinking) this entire situation, and it's kinda sad.

The situation is simple:

- Team wants game to have a filmic look.
- Films LITERALLY run at 24fps (in general). Team acknowledges this.
- Again, TEAM WANTS GAME TO HAVE A FILMIC LOOK. Read this line many times.
- Team decides that in order for the game to LOOK (keyword) closer to a film, they should choose the most acceptable framerate closest to the 24fps that films use. This is 30fps.

NOWHERE do they say 30fps looks, feels, or plays better than 60fps. NOWHERE do they ever say they CONSIDERED running the game at 24fps.

It doesn't matter why films run at 24fps, nor do any technical details about that matter. The only thing that matters is the fact that films DO run at 24fps, and films...look filmic.

To say this is bullshit, is to say that films running at 24fps is bullshit, which would mean what you're saying is bullshit.

The 24 fps standard goes back to 1926. It's bullshit that we've kept it this long.

60 fps is better in every situation always. It only looks "weird" because you're not used to it.

If you had grown up seeing everything in 60 fps and then someone showed you 24 and said "See, it's more filmic and cinematic, the blur makes it better," you'd call that bullshit. Arguing the opposite of that is nuts.
 
The funny thing is that there are a lot of games out there that probably run below 24fps while stating they are 30fps games. 24fps would probably be very playable.
 
Those curious should give http://www.svp-team.com/ a try. This is frame interpolation software that works on any video content. The difference is very noticeable but it's up to you to decide whether the higher 'framerate' is worth it. I've tested it and used it a lot and ultimately decided that I prefer the 24 fps standard.

Results can be very impressive in anime too, but I decisively went back to standard 24fps.



Also, everyone who has seen both knows that subjective perception plays a basic, crucial, key role in this discussion.

But those are fake extra frames. In film, higher frame rates get you more detail on anything in motion. Interpolation can't figure out all that extra detail.

It'd be as unfair as if I said '24 fps looks like this' and took the 48 fps version of The Hobbit and just dropped every other frame (and no, that's not how they arrived at the 24 fps version of The Hobbit).
 
I also want to add that because the industry is so fixated on 24 fps film, I have to use SVP (Smooth Video Project) to watch my TV shows and movies at my screen's refresh rate (75 Hz). Likewise, I also use my rig's extra grunt to play games at framerates much higher than 30 when I can.

But those are fake extra frames. In film, higher frame rates get you more detail on anything in motion. Interpolation can't figure out all that extra detail.

It'd be as unfair as if I said '24 fps looks like this' and took the 48 fps version of The Hobbit and just dropped every other frame (and no, that's not how they arrived at the 24 fps version of The Hobbit).

Sure, they are filler frames, but it sure as hell looks better than the vanilla speed. There are also built-in algorithms in the program that produce better frame blending than "120 Hz" TVs (assuming your rig is up to the task).
 
Video games look nothing like films, they shouldn't try to be films either. It's a young, interactive medium, piling on baggage from a completely different industry like this is insane.

100% agree, I'm glad not all devs think this way with regards to "filmic" games. The worst.
 
I also want to add that because the industry is so fixated on 24 fps film, I have to use SVP (Smooth Video Project) to watch my TV shows and movies at my screen's refresh rate (75 Hz). Likewise, I also use my rig's extra grunt to play games at framerates much higher than 30 when I can.



Sure, they are filler frames, but it sure as hell looks better than the vanilla speed. There are also built-in algorithms in the program that produce better frame blending than "120 Hz" TVs (assuming your rig is up to the task).

It's not a valid comparison though. You aren't seeing the full benefit of higher frame rates when you fake it. People look at interpolated stuff, and see the things that make it unusual, but many of the benefits they should be seeing aren't there.

People write off native higher frame rates based on interpolated stuff, and it's wrong wrong wrong.
 
Those curious should give http://www.svp-team.com/ a try. This is frame interpolation software that works on any video content. The difference is very noticeable but it's up to you to decide whether the higher 'framerate' is worth it. I've tested it and used it a lot and ultimately decided that I prefer the 24 fps standard.

Results can be very impressive in anime too, but I decisively went back to standard 24fps.



Also, everyone who has seen both knows that subjective perception plays a basic, crucial, key role in this discussion.
Uh what, you can't seriously think frame interpolation is even close to the real experience of a higher framerate; it's just simply not the same. It works REALLY SHODDILY.
 
I might be nitpicking but that's so obviously low framerate. I mean, I'd probably be able to play it but it would bug me very much during gameplay, it's jittering really badly.

It is absolutely noticeably low frame rate and far from 60fps. My point was to show that it was nowhere near as bad as Dennis's examples seemed to purport and that while definitely not ideal, it was still a playable game.
 
People are overthinking (or maybe underthinking) this entire situation, and it's kinda sad.

The situation is simple:

- Team wants game to have a filmic look.
- Films LITERALLY run at 24fps (in general). Team acknowledges this.
- Again, TEAM WANTS GAME TO HAVE A FILMIC LOOK. Read this line many times.
- Team decides that in order for the game to LOOK (keyword) closer to a film, they should choose the most acceptable framerate closest to the 24fps that films use. This is 30fps.

NOWHERE do they say 30fps looks, feels, or plays better than 60fps. NOWHERE do they ever say they CONSIDERED running the game at 24fps.

It doesn't matter why films run at 24fps, nor do any technical details about that matter. The only thing that matters is the fact that films DO run at 24fps, and films...look filmic.

To say this is bullshit, is to say that films running at 24fps is bullshit, which would mean what you're saying is bullshit.

The subtle 'stutter' that is noticeable in 24fps film has the following repercussions:
- films are carefully shot; in many contexts fast movements, and certainly particular speeds of camera panning, are avoided to make sure the stutter is not really consciously noticeable
- the brain still perceives a certain level of stutter, making it feel less like you're looking through a window and more like you're watching a fast flipbook of photos. Psychologically this creates a different effect that tends to work well for storytelling, and it is the opposite of 'reality show' or news broadcasts, where the 50 to 60 fps look tends to give you a 'reality' feel rather than a 'storytelling' feel.

One key thing you should realize from that is that the following is not true:
60 fps is better in every situation always. It only looks "weird" because you're not used to it.

I'm sorry. It simply isn't. In the same way you could argue that removing the 'black bars' from an extra-wide-angle film is 'always better' because you see more. However, there are stylistic reasons for this and there is also a level of subjectivity to this.

Now even though 24 fps most certainly has its place in cinematic storytelling, it is still generally pretty lousy for games. The effect of having framerates that are so low that a level of stuttering is actually noticeable in order to have a more storytelling-like effect just doesn't tend to benefit games much even from a psychological angle because you are not passively taking in a story, you are interacting with a world that needs to be responsive and immersive. I'm sure there are and will be games that perhaps use a 24fps/30fps look to their benefit but these will be extremely rare and probably more experimental. There just isn't much reason at all, generally speaking, to go below 60fps in games aside from being able to crank up the detail.

I'll probably sound cocky but this thread is so full of bs that I'm going to emphasize it: as someone who has tons of experience with real-time graphics, offline animation and real life cinematography and as someone who has specifically studied this very subject elaborately, I know what I'm talking about.
 
YouTube videos are at 30FPS, including 'Let's Play' videos.

I haven't seen a lot of people complain that watching YouTube videos of games being played at 30FPS is actually detrimental.

Lots of people complain about YouTube's awful video quality.
 
Like, real-life movement feels natural. It just happens. You don't notice a framerate.

24fps movies are like that. When people move it feels like how my eyes see real people.

60fps is more like "look at me I'm sooo smooth!" Feels artificial

I thought you were joking until this post. Yeah. I don't think you're gonna have a lot of company here with that opinion.
 
But movie-like motion blur isn't possible in games. You can't demand a movie-like experience with today's capabilities.

For example, here is a Portal 2 video running at 24fps with proper movie motion blur:
https://www.youtube.com/watch?v=cKRE6yG74a4

This is what the OP wants, and it's not that bad in terms of visuals (not counting gameplay). The problem is, this ~2m video took about 2 hours to render. It's just not possible to render all that per-object motion blur in real time, so we can never get a true movie-like experience in games. Therefore, demanding 24 fps in games because of all that "movie" feel is just wrong.
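Rough arithmetic from the numbers in that post (assuming the clip is exactly two minutes and the render took exactly two hours):

```python
clip_frames   = 2 * 60 * 24      # ~2880 frames in a 2-minute clip at 24fps
render_time_s = 2 * 60 * 60      # 7200 s of offline rendering
per_frame_s   = render_time_s / clip_frames
budget_s      = 1 / 24           # real-time budget per frame at 24fps

print(per_frame_s)               # 2.5 s per frame
print(per_frame_s / budget_s)    # ~60x slower than real time, before gameplay logic
```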

Doesn't this prove that 24fps, provided hardware is powerful enough, ironically, would be ideal from a presentation standpoint? As someone who's a staunch supporter of more 60fps games, I can't deny that looks way more impressive than a game running at 60fps. This looks like evidence in support of 24fps, not against it.
 
People are overthinking (or maybe underthinking) this entire situation, and it's kinda sad.

The situation is simple:

- Team wants game to have a filmic look.
- Films LITERALLY run at 24fps (in general). Team acknowledges this.
- Again, TEAM WANTS GAME TO HAVE A FILMIC LOOK. Read this line many times.
- Team decides in order for game to LOOK (keyword) closer to a film, choose most acceptable framerate closest to the 24fps that films use. This is 30fps.

NO WHERE do they say 30fps looks, feels, or plays better than 60fps. NO WHERE do they ever say they CONSIDERED running the game at 24fps.

It doesn't matter why films run at 24fps, nor do any technical details about that matter. The only thing that matters is the fact that films DO run at 24fps, and films...look filmic.

To say this is bullshit, is to say that films running at 24fps is bullshit, which would mean what you're saying is bullshit.

24 fps doesn't define a filmic look any more than 30 fps defines the premium cinematic experience.
And, you can quote me on this: Avatar 2, at 48-60 fps, will look more like a movie than The Order could ever dream of.
 
Erm. No. You'd still be getting motion judder from showing one frame three times and the next frame two times.

If you didn't do that, you'd have constant tearing, which would be just as bad.

The amount of wrongness in this thread is jaw dropping.

That'd be a very "cheap" way to convert 24fps to 60fps.

It works much better to mix frames. For example: frame A displays twice, then a mixed A/B frame is displayed, then frame B displays twice. Each output frame is displayed for an identical length of time.

Judder is actually fine though, at least with regards to visuals. The whole point of dropping to 24fps is to replicate the filmic experience. It's sacrificing image quality in order to achieve a visual effect. Since judder is a byproduct of 24fps that affects movies as well, having minimal amounts of it seems like a good thing.
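For clarity, here's a sketch of the two cadences being discussed: plain 3:2 pulldown (the judder case from a few posts up) versus the evened-out blended cadence described here. Frames are just labels; "A/B mix" stands for a 50/50 blend of the two.

```python
def pulldown_32(frames):
    """Plain 3:2 pulldown: odd source frames held for 3 refreshes, even ones
    for 2, which is the uneven cadence (judder) mentioned above."""
    out = []
    for i, f in enumerate(frames):
        out.extend([f] * (3 if i % 2 == 0 else 2))
    return out

def blended_cadence(frames):
    """The evened-out cadence from this post: A, A, A/B mix, B, B.
    Every output frame occupies the same amount of screen time."""
    out = []
    for a, b in zip(frames[0::2], frames[1::2]):
        out.extend([a, a, f"{a}/{b} mix", b, b])
    return out

src = ["A", "B", "C", "D"]
print(pulldown_32(src))      # ['A','A','A','B','B','C','C','C','D','D']
print(blended_cadence(src))  # ['A','A','A/B mix','B','B','C','C','C/D mix','D','D']
```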
 
30fps
30fps frame interpolated to 60fps with SVP
60fps

Frame interpolation works pretty decently when it comes to games like this, though you can clearly see artifacts around the knife and around most objects.

In real-life scenarios, if you plan to frame-interpolate during actual gameplay, you probably wouldn't be able to do it; it would be extremely unplayable, with high latency
(and I bet the processing power for that would be more expensive than just rendering a couple more frames of the game).

I don't really like it but it's pretty interesting.
 