In defense of the "filmic" look.

Fuck cut scenes, we're playing video games not movies. Leave cut scenes and movies out of the discussion altogether.

Also, you will never get the same effect as films, because one is recording real life and the other is using processing power to recreate that effect.

The artificial way is always going to fall flat next to the real deal, so trying to copy it is doing games a disservice.
 
Here is a game running at 24 fps:

http://a.pomf.se/exqckv.webm

That cinematic look....


But seriously, didn't film projectors, before going digital, use a shutter to flash each frame two or three times and make playback look smoother than it actually was? When filmmakers settled on 24FPS for movies, it was the lowest frame rate they could get away with while still making the movie look presentable. It was done to economize on the cost of the film reels that would be distributed to cinemas.
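The arithmetic behind that shutter trick is simple. A minimal sketch (the blade counts below are the commonly cited ones, not from any particular projector's spec):

```python
# Mechanical film projectors advance the film at 24fps, but a rotating
# shutter flashes each frame two or three times to push the flicker
# frequency above what the eye perceives as strobing.
FILM_FPS = 24  # frames of actual picture per second

def flicker_rate(fps: int, blades: int) -> int:
    """A shutter with N blades flashes each frame N times per film advance,
    so the perceived flicker rate is fps * blades."""
    return fps * blades

two_blade = flicker_rate(FILM_FPS, 2)    # 48 Hz of flashes from 24 real frames
three_blade = flicker_rate(FILM_FPS, 3)  # 72 Hz with a three-blade shutter
frame_ms = 1000 / FILM_FPS               # each picture is still held ~41.7 ms
```

So the motion is still only 24 distinct pictures per second; only the flicker gets smoothed out, which is why it "looked smoother than it actually was."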
 
This.

Maybe some day people will understand that fps doesn't have that much impact on the quality of a game.
It's all case by case, but I do think it shouldn't be acceptable to dip under 30fps regularly at this point. We have hardware that can make amazing-looking games at 60fps; if you can't adequately mask or otherwise justify going below 30fps, then don't do it. This isn't the PS1 or N64, where dropping below 30 was practically necessary for a game with any scope. Now that's mainly just the domain of very large, open games.
 
The one thing I would believe is that higher framerates expose unnatural things more easily because the brain has more information to work with.

i.e. at a lower frame rate your mind fills in the gaps, in the same way you can look at a rough sketch and see a face in it. But the more refined the image, the easier it is for things to look wrong, like a badly proportioned nose or ears.

I believe a fair few film critics have remarked, using The Hobbit as an example, that higher frame rates make sets look more like sets, because the seams are showing and the lack of realistic detail is more apparent. In games, higher frame rates can expose lacking character animation, especially in cut scenes that try to be cinematic.

It all relates back to the ancient challenge of trying to fool the human brain with art, since the brain is evolved to experience reality in a particular way through the senses.
 
30FPS is bullshit born of devs wanting to implement more eye candy than would be possible at 60FPS.

That is literally all there is to it.
 
I get that 60fps plays better, and overall I definitely prefer that. When it comes to movies, though, doesn't 60FPS make it look strange? Change the way it feels? Why can't the same be said for games?

Not only does the feel of the fps change things, but also the layers of detail and effects they can add to the game leading to a more cinematic package.

I know I'm gonna get dumped on here, but I'm mostly playing devil's advocate and asking: if it can work in movies, why can't it work in games?

I think the counter argument would essentially be this: the way something "feels" is subjective and almost certainly tied to what we grew up with. We've always seen movies in 24FPS, so now it "feels" right to us.

Objectively, however, 60 FPS simply provides more information with absolutely no downside to the viewer. I can imagine some specific, artistic uses of lower frame rates (these already exist, as some shots are done at even 1-10 FPS to elicit a specific response, just as black and white is still used over color in some films), but as a default or standard there is no objective argument for 24 FPS.
 
I would argue the opposite - how about movies all embrace 60fps and be glorious and beautiful? :D Or at least 30fps. 24fps is such an odd number.
 
The one thing I would believe is that higher framerates expose unnatural things more easily because the brain has more information to work with.

i.e. at a lower frame rate your mind fills in gaps. In the same way you look at a rough sketch and can see a face in it. But the more refined the image, the easier it is for things to look wrong - like badly proportioned nose and ears.

I believe a fair few film critics have remarked using the Hobbit as an example, that higher frame rates make sets look more like sets because the seams are showing and the lack of realistic detail is more apparent. In games, higher framerates can expose lacking character animation, especially in cut scenes that try to be cinematic.

It all relates back to the ancient challenge of trying to fool the human brain with art, since the brain is evolved to experience reality in a particular way through the senses.

Excellent post describing the various issues at play.

I just looked up from my monitor and took a look at life. It's running at, maybe, 30fps. There were some drops to 24 though.

Hoping this is a joke.
 
This.

Maybe some day people will understand that fps doesn't have that much impact on the quality of a game.

Is that why games like Call of Duty continue to be 60 FPS? Because it makes no impact and the masses of people who buy them can't tell the difference and/or don't care?
 
Here is a game running at 24 fps:

http://a.pomf.se/exqckv.webm

Wtf....gif


Oh my god, yuck. Fuck that shit. 60fps or bust.
 
Is that why games like Call of Duty continue to be 60 FPS? Because it makes no impact and the masses of people who buy them can't tell the difference and/or don't care?

Well, obviously Call of Duty is low-brow drivel for the masses, just like soap operas.
Meanwhile the crème de la crème of video gaming entertainment, Naughty Dog's The Last of Us, runs at sub-30 frames per second, just like the Citizen Kane of movies: Citizen Kane.
 
This.

Maybe some day people will understand that fps doesn't have that much impact on the quality of a game.

It has plenty of impact on the quality of a game; it's just that in a lot of cases there's no choice. A game can be good at 30fps, but I firmly believe that virtually every game is better at a higher frame rate.
 
Higher framerate is always, always preferable in games. It's not just about visuals, it's about the fact that controls are more responsive, as the game is reading control inputs at a quicker rate. Sure, games can be perfectly fine at 30fps, but no game will ever be worse off for having a higher framerate.

The sad thing is, we used to get plenty of games at 60fps back in the Xbox/Gamecube days, as well as in the SNES and Mega Drive era.
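The responsiveness point can be made concrete with a back-of-the-envelope model. This is a deliberate simplification (real engines pipeline rendering, and displays add their own lag, so actual numbers are higher), but the halving holds:

```python
# Simplified worst-case input-to-display latency, assuming input is sampled
# once per frame. Not engine-accurate; just an illustration of why controls
# feel snappier at higher frame rates.
def worst_case_latency_ms(fps: float) -> float:
    """A press landing just after a sample waits up to one full frame to be
    read, then its result appears one frame later: ~2 frames total."""
    frame_ms = 1000.0 / fps
    return 2 * frame_ms

# At 30fps the worst case is ~66.7 ms; at 60fps it halves to ~33.3 ms.
```

Whatever the exact pipeline, doubling the frame rate roughly halves the input-related portion of the delay, which is what you feel on the stick.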



Round-5-Danny-Amendola.gif


You need to get yourself to an optician, son, because something is not right there.

This goes back to my original post... I'm arguing feel here, not how the game controls.

I agree with you OP, fine with 30FPS myself when it comes to the filmic look.

Glad RAD went with 30FPS.

Honestly it's not even about RAD or The Order. I've seen this argument so many times and it's always just laughed at; the notion of someone deliberately targeting 30FPS or 24FPS gets laughed at on this forum. I'm sure a lot of people understand that's due to input, but I wonder how many of them realize that 24p is a premium feature on cameras, and one of the major components of the "film look".

Yeah fuck that. That looks choppy and...well, bad.

This webm keeps getting passed around, when that's not even how a game would be rendered at 24fps. Besides, have you played Halo? Resistance? Uncharted? Gears of War? They all look damn good.

I think the counter argument would essentially be this: the way something "feels" is subjective and almost certainly tied to what we grew up with. We've always seen movies in 24FPS, so now it "feels" right to us.

Objectively, however, 60 FPS simply provides more information with absolutely no downside to the viewer. I can imagine some specific, artistic uses of lower frame rates (these already exist, as some shots are done at even 1-10 FPS to elicit a specific response, just as black and white is still used over color in some films), but as a default or standard there is no objective argument for 24 FPS.

I can't argue with that, but wouldn't you agree that the majority of us grew up with 24 as the standard? So in the eyes of a good chunk of people, including developers, 30fps will have a more cinematic feel. It's a simple concept to understand; I don't see why it's so laughable at all.
 
It's all case by case, but I do think it shouldn't be acceptable to dip under 30fps regularly at this point. We have hardware that can make amazing-looking games at 60fps; if you can't adequately mask or otherwise justify going below 30fps, then don't do it. This isn't the PS1 or N64, where dropping below 30 was practically necessary for a game with any scope. Now that's mainly just the domain of very large, open games.

Not below 30 anymore. At this point I can agree: every developer should aim for a stable 30fps minimum. But if most of them opt for releasing games at a stable 30fps, I'm perfectly fine with using the rest of the performance to upgrade the game world. Since this is all about The Order, let's just use that as an example.

zsnime.gif


This game has an impressive level of detail, and even targeting 30 it looks PERFECTLY playable (very distant from that WD gif at 24fps). I don't mind this being 30 at all. Not really! And that said, I wouldn't sacrifice anything of the visuals there, which seem so atmospheric and unique, just to run it at a smooth 60 frames per second.

And that's what the dev said, I believe. He wanted that immersion, and he wouldn't reach it at 60fps, so he chose to leave it at 30fps, where it would feel more "filmic" than at 60fps and visually downgraded. But he also wouldn't drop it to 24fps to improve the graphics further, because then it would really hurt the gameplay.
 
I remember once I saw a comparison video of a racing game; I think it was Forza Horizon. The comparison was between 24 FPS, 30 FPS, and 60 FPS. I remember thinking that the 60 FPS was really nice and smooth, and the 24 looked very "cinematic", with the 30 FPS looking like the compromise, if I make myself understood.

If a game was built around 24 FPS, I think it could work.
 
24fps Stockholm Syndrome is a terrible thing indeed.

Well, at least one can have a laugh or two with the silly rationalizations. Silver linings.
 
I just looked up from my monitor and took a look at life. It's running at, maybe, 30fps. There were some drops to 24 though.
You're supposed to take the Oculus off.

You're also supposed to be running that at 60 fps rather than capped at 30 on underpowered hardware.
 
Not below 30 anymore. At this point I can agree: every developer should aim for a stable 30fps minimum. But if most of them opt for releasing games at a stable 30fps, I'm perfectly fine with using the rest of the performance to upgrade the game world. Since this is all about The Order, let's just use that as an example.

zsnime.gif


This game has an impressive level of detail, and even targeting 30 it looks PERFECTLY playable (very distant from that WD gif at 24fps). I don't mind this being 30 at all. Not really! And that said, I wouldn't sacrifice anything of the visuals there, which seem so atmospheric and unique, just to run it at a smooth 60 frames per second.

And that's what the dev said, I believe. He wanted that immersion, and he wouldn't reach it at 60fps, so he chose to leave it at 30fps, where it would feel more "filmic" than at 60fps and downgraded. But he also wouldn't drop it to 24fps to improve the graphics further, because then it would really hurt the gameplay.

No, 24FPS is not about better graphics, it's about the feel. This is exactly what I'm talking about; people are so damn misinformed when it comes to this stuff. Yes, most of the time when a dev goes for 30fps they're going for visuals... however, when they say "filmic look", they're talking about how it looks in motion, similar to the soap opera vs. movie argument.
 
Because 60 fps is already a slideshow; 120 fps is where it's at. If you want to play a game at 30 fps, you might as well go to a museum and look at paintings.
....

I'm going to assume you say this in jest. You know, kind of as if you were having an only semi-serious argument with a good friend and wanted to shoot down their opinion in a funny way.
 
Real life has motion blur and grain and a host of other imperfections. 60fps looks unnatural because of this. It's not just that we haven't got used to it; sports programs have been broadcast in this format for decades. It looks cheap and always will.
 
zsnime.gif


I've seen that .gif quite a bit in the last few hours. A little off topic, the reticle disappears while in bullet time, is the player aiming at that point?
 
"Life runs at xyz framerate"...Staaaap. Things consist of waves, and they work on a continuum. Of course that may be different than how the brain processes them, i.e. after a certain number of frames our brain can't tell the difference, but even then it's way beyond 60fps. I'll put it this way, I can tell the difference between 120 and 240 frames.

I remember watching the making of The Hobbit, and one of the technical directors, talking about the 3D cameras, even said the limit of the human eye was 60fps. I guess professionals talk out of their asses too.

Real life has motion blur and grain and a host of other imperfections. 60fps looks unnatural because of this. It's not just that we haven't got used to it; sports programs have been broadcast in this format for decades. It looks cheap and always will.

I see this all the time too. If that's the case, then shouldn't your eyes blur the motion in the game? Since, you know, looking at a video game is still "real life".
 
Here is a game running at 24 fps:

http://a.pomf.se/exqckv.webm

Here is another example of 24 fps gameplay

http://a.pomf.se/plweeh.webm


These don't look to be running at 24fps at all.

I did my own test of a game rendered, recorded, and played at 24fps.

It was completely playable and not anywhere near as stuttery as yours appear to be.

Maybe you recorded at 60 and exported at 24fps, but that's a totally different experience.

I've got a video rendering out right now that I'll post as soon as it's done, but everyone keep in mind, these videos are not representative of actual 24fps gameplay.
 
I prefer motion blur in realistic-looking games if 60 frames per second is used; otherwise you end up seeing the details of objects moving past you at high speed so clearly that it looks unnatural.

It's just like how if you wave your hand in front of your face rapidly, you can't make out small details on it.


At 24-30 frames per second, the frames flash at a speed where I can process each one thoroughly, so for whatever reason I'm completely fine without motion blur.

And for the record, 24 frames per second and above leaves me not even thinking about FPS unless I consciously bring it up with myself. Only below 24 frames per second do I start noticing that something is wrong. Maybe it's because I was raised in the Nintendo 64 days, back when 24 frames per second was used in games like The Legend of Zelda: Ocarina of Time.
 
This... people always forget that 24fps only looks good because of motion blur. Take that out and movies would look like shit.

OK, well, the combination of 24fps and motion blur has created, through how we've seen it over the years, a certain look and feel that has stuck.

24 FPS would feel like shit in a game.

Thanks for your input, seems you lack any sort of reading comprehension.
 
No, 24FPS is not about better graphics, it's about the feel. This is exactly what I'm talking about; people are so damn misinformed when it comes to this stuff. Yes, most of the time when a dev goes for 30fps they're going for visuals... however, when they say "filmic look", they're talking about how it looks in motion, similar to the soap opera vs. movie argument.

Then I think we're having three "chains" of thought here, and I just realized it now. I'm not a 60fps enthusiast, but really wanting a game to run at 24fps (as in, target 24 and lock to it) wouldn't be a good idea. And I'm pretty sure that isn't what the developers intended to say in that interview.

I know it would feel less bad than that WD capture (since they would apply some motion blur), but at 30 fps it just feels better. And movies are different from games, so they're not really comparable at the same exact frame rate.
 
30fps is smoother on consoles than on PC. As long as it's locked, 30fps is perfectly fine.
That's not a computer/console thing; it's a TV/monitor thing, if anything. I'm not sure of the exact cause; maybe it's the slower response time of the LCD in a TV versus a monitor, so you end up with a bit of natural ghosting to smooth it out, or TVs process the image the way they do for TV shows and movies. In either case, no: plug a console into a monitor, or a computer into a TV, cap the game at 30 fps, and you will see the difference between the two displays.
Not below 30 anymore. At this point I can agree: every developer should aim for a stable 30fps minimum. But if most of them opt for releasing games at a stable 30fps, I'm perfectly fine with using the rest of the performance to upgrade the game world. Since this is all about The Order, let's just use that as an example.

http://a.pomf.se/zsnime.gif

This game has an impressive level of detail, and even targeting 30 it looks PERFECTLY playable (very distant from that WD gif at 24fps). I don't mind this being 30 at all. Not really! And that said, I wouldn't sacrifice anything of the visuals there, which seem so atmospheric and unique, just to run it at a smooth 60 frames per second.

And that's what the dev said, I believe. He wanted that immersion, and he wouldn't reach it at 60fps, so he chose to leave it at 30fps, where it would feel more "filmic" than at 60fps and visually downgraded. But he also wouldn't drop it to 24fps to improve the graphics further, because then it would really hurt the gameplay.
Yeah, some games are fine at 30 because they're slower paced, really can look that much better for it, and likely won't actually be improved THAT much by 60 fps. In the same way, a game built for 60 fps can be wrecked by going to 30; I can't imagine doing that with the likes of mainline DMC or MGR. I do prefer 60, but as long as it's a stable 30 fps, or the game does something so over the top that it can justify going under (Shadow of the Colossus and taking on colossi in open arenas), then I don't mind. I don't really need, say, most non-action RPGs at 60 after all, even if I prefer racers at 60 and practically need 2D platformers at 60. And there are exceptions there too: I was mainly bugged by Rayman Origins being 30 on 3DS because other platformers like Mario run at 60 there, and the game was MADE to be 60, whereas Little Big Planet and Puppeteer were made for 30, so whatever grievances I have with those games, it's not primarily the fps... though the lack of v-sync in LBP1 at least can be irritating.
 
Real life has motion blur and grain and a host of other imperfections. 60fps looks unnatural because of this. It's not just that we haven't got used to it; sports programs have been broadcast in this format for decades. It looks cheap and always will.

Did you know your eye creates the effects of motion blur and depth, and that these apply even to the images in a video game?

It's just that a video game updates at a per-frame rate, unlike real life, which has an infinite exposure time and an infinite frame rate.

30 FPS should look even MORE unnatural than 60fps. More frames = better for your eye to create the natural effect of motion blur.
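The "infinite exposure time" point is essentially what film cameras approximate and what per-frame game rendering skips: a camera integrates motion over the time the shutter is open, while a game freezes the scene at one instant. A toy sketch of the difference (the `position_at` callback and the sample counts are hypothetical, not from any engine):

```python
# A game frame samples an object at a single instant; a film-style frame
# averages the object's appearance across the exposure window, which is
# where natural-looking blur comes from.
def sharp_sample(position_at, t):
    """Game-style: the object frozen at one instant."""
    return position_at(t)

def blurred_sample(position_at, t, exposure, n=8):
    """Film-style: average n samples spread across the exposure time."""
    return sum(position_at(t + i * exposure / n) for i in range(n)) / n

# Example: an object moving at 100 units/s, filmed at 24fps with a
# 180-degree shutter (exposure = half the frame time = 1/48 s).
pos = lambda t: 100.0 * t
```

For the moving object, `blurred_sample` lands between the positions the object occupied during the exposure rather than at a single point, which is the smear a real camera records and a naive game renderer omits.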
 
His post insinuated COD = 60fps and sells great, therefore 60FPS = better sales.

I flipped it with Halo = 30fps, therefore fps doesn't matter.

I don't think that's how that works.

Which Halo are you talking about that runs at 30?

Maybe I'm wrong, haven't played Halo in a bit, but the newer or new-ish ones run at 60.
 
48+ FPS in movies needs changes in cinematography to go along with the frame rate. Lighting, sets, costume design, choreography, and special effects all need to be tailored to the higher temporal "resolution", for lack of a better term.

Taking The Hobbit as an example, the 24 FPS versions of the fight scenes were your typical movie tropes: blurry and shaky as hell, with no way to tell what's going on except for the one or two characters directly in front of the camera.

The 48 FPS version brought out the incredible choreography of the whole thing. You could see the detail in all the movement, the special effects, the costumes, etc., EVERYWHERE. You could follow the fight in the background between a single dwarf and five goblins just as well as the fight happening in the foreground, a fight that went completely unnoticed when I watched the 24 FPS version, because it was a tiny smear in the background.

Probably the main thing that sold me on high frame rate movies.

Agreed. Worth driving 2 hrs to the theater several cities over for.
 
Like, real-life movement feels natural. It just happens. You don't notice a frame rate.

24fps movies are like that. When people move it feels like how my eyes see real people.

60fps is more like "look at me I'm sooo smooth!" Feels artificial

No offense, but you have no idea what you're talking about. 24fps is nothing like "real life"; it's simply the lowest number of frames per second that gives the illusion of movement without looking choppy like stop motion. Our eyeballs don't take in discrete frames of what we see, unless you're blinking like a madman, and that would look rather bizarre.

The thing with the illusion 24fps creates is that it allows our brain to fill in the blanks and give the picture a sense of life; you're literally just looking at a slideshow of pictures. If you think that's what your eyes see, I don't know what to tell you.

60fps prolly feels more "artificial" to you because it removes the illusion that 24fps creates, which you're used to and have come to associate with films. It's simply a case of being used to something to the point of it being the norm; not to mention that currently it looks more appealing stylistically (higher-fps films are still very much in their infancy).
 