In defense of the "filmic" look.

Framerates around 24fps have noticeable stutter, even if only on a somewhat subconscious level. With movies we mostly only notice it when the camera pans at a certain speed (which is why filmmakers avoid doing that). The thing is, for storytelling, this super-subtle stutter has a particular psychological effect that works well for cinematic purposes: it's almost like you're aware that you're watching a fast photo flipbook telling you a story. Speed things up to 60fps and this effect is lost, giving you the 'reality TV' feel. Neither is better than the other; they just serve different purposes.

This is my own theory on the subject and I strongly believe in it, based on all the experience I've built up with CG, real-time graphics, and real-life cinematography.

I'm just quoting myself here because I'm right and most of you are just batshit insane when it comes to this topic. Srsly.

On a more serious note (not that I didn't sort of mean what I just said), I do simplify things a bit in the explanation above, but I'm sure that's what it comes down to. I say I simplify because I'm not taking motion blur/shutter speed into account and such; overall that just means fps is not the only factor, so things are slightly more complex, but not by much.

In short: I just explained everything you need to know to understand the issue in that quote. Read it. /thread
 
It probably only actually works when it's subtle and the game isn't THAT dependent on low LCD response times. But yeah, generally lower response times are better, though I imagine any decent quality TV from within the last decade will do a pretty good job there, unlike monitors from around the late '90s/early 2000s.

I do wonder if there might be some TV processing to account for too. At any rate, I'm sure that whatever makes a game "look better at 30 FPS" is tied to the TV, and that a PC hooked up to one WILL look "better" in the same way a console does. And hopefully that comparison is against a PS4, or at least an XB1, not a PS3 or 360.

If you have the TV's motion interpolation turned on it can make it look smoother, but you get horrible latency, making nearly all games unplayable.

Nothing says this has to be the TV, though; plugging a controller into a PC and playing at 30FPS will prove that.
 
That 30fps feels smoother on consoles than on PC.

Thanks for responding by repeating your platitude. That does not make it real.
The idea is that the perception exists for a reason.
Perceptions are common, and common perceptions are also pretty common. They are just intensely subjective perceptions until someone proves to me that said perception touches some common truth. I do not share in that perception.

No one has proven this perception to be at all universal regarding the whole "console games look better at 30fps than PC games" thing.

I imagine the perception came about much like how my uninformed friend developed it. His computer was not up to the task of maintaining a stable fps, and he did not understand how VSync and frame locking were activated on PC. As soon as I showed him, he said "smooth as a console game."

To which I chuckled.
I've tested with motion blur turned off and it still looks nothing like the Watch_Dogs videos Dennis posted.
Maybe his encoding broke? Not sure.

Also, I decided to enjoy this whole debate and show a scene (obviously not interactive for the viewers, which would prove 60fps' worth even more) where high motion is a detriment at a low framerate. I also chose a game with one of the best motion blur implementations next to Ryse: Metro Last Light. Each clip was rendered separately at a locked framerate with VSync so the motion blur stepping works correctly.

24fps
http://a.pomf.se/klrvan.webm
60fps
http://a.pomf.se/orutqz.webm
 
As I stated in another thread, motion blur should not be used as a band-aid to cover up a game unsteadily juddering along at 20-30fps. That crap needs to stop.
No, I don't believe it does at all in this case. The argument is about a "filmic look" in games, and films would judder if motion blur didn't exist. Games that want to closely ape movie aesthetics need to implement motion blur to get closer to the intended look.

Also, the webm we're referring to apparently runs at about 12 inconsistent FPS, and without blur at that. Strangely enough, the poster never responded to the questions or issues raised initially, never mind after having his "evidence" exposed for what it really is.

(A quick review of our conversation):
I laughed.
 
I find this argument both ridiculous and hilarious. This "filmic" or "cinematic" "feel" or "look" excuse has been used to justify 24fps, 30fps, etc. so many times that it's become completely meaningless. Ultimately you are playing a video game, not watching a movie. That claimed "look" and "feel" is never true.

The only occasions where this is acceptable are when the claims are genuine and you are fully attempting to emulate a film's look. The Order is probably the closest to this at the moment, although I am not sure this "filmic" look/feel has ever been stated as their intention, even though they have gone full anamorphic with film grain and a film-typical fps. Even then I still won't be convinced until we see the final product, how it plays out, whether that is indeed what they are aiming for, and whether the gameplay itself is driven to be filmic. Almost no games follow through with this idea and actually emulate that look and feel from a gameplay perspective.

Any game that simply settles for a low frame rate should just own it and be honest: "this is the best graphical fidelity we can achieve with what we are working with, and we don't want to compromise it to improve its framerate because of x, y, z." At least that is honest.
 
If you have the TV's motion interpolation turned on it can make it look smoother, but you get horrible latency, making nearly all games unplayable.

Nothing says this has to be the TV, though; plugging a controller into a PC and playing at 30FPS will prove that.
While it may just be some kb/m versus controller thing, I have to assume A LOT of that "feels smoother" is still in the display, and I was personally going by my experience with FFXIV on my PC versus on a TV WITHOUT motion interpolation.

It may also be due to display expectations, but I swear a little bit of extra blurring on the TV helped make it feel a bit smoother, or at least "acceptable" at a locked 30 fps, even if 60 is out-and-out better on either display.
 
Guys, I don't know what alternate reality you live in, but there's no way in hell real life feels less smooth to me than anything I've seen running at 60FPS.
 
What is cheap about broadcast quality cameras?

And if resolution is the reason 60i looks cheap, then a 480i CRT, or an ad being shown in a small online video window, would ruin that 'film look' (3:2 pulldown aside).

Old video cameras, especially the ones used for soap operas, have much worse color range and depth than film. On a 480i CRT, film will still benefit from being "downscaled" as opposed to the native resolution of a video camera, which would be 240 lines.
 
Whenever I play something that is CG heavy (like FFXIII) I turn on auto motion plus to high for that 60fps look. Sure, it introduces some weird artifacts when the image pans too fast horizontally, but it just looks better. That's just my opinion, of course.
 
Whenever I play something that is CG heavy (like FFXIII) I turn on auto motion plus to high for that 60fps look. Sure, it introduces some weird artifacts when the image pans too fast horizontally, but it just looks better. That's just my opinion, of course.

It also adds like 100 or 200ms of input lag.
 
Guys, I don't know what alternate reality you live in, but there's no way in hell real life feels less smooth to me than anything I've seen running at 60FPS.
I did speculate on a possible uncanny valley effect, but if I'm right that'd be eliminated by even higher fps, not lower. Maybe if we can get true 240hz monitors it can be eliminated as far as the eye can tell.

EDIT: Well, there's motion blur too, but I kind of imagine an image running at that many frames WOULD appear naturally motion blurred. Then again, we're talking about a display emulating what we see, and there are all sorts of little details it can't necessarily capture for one reason or another.
 
Guys, I don't know what alternate reality you live in, but there's no way in hell real life feels less smooth to me than anything I've seen running at 60FPS.

If you move your hand close to your face you'll see a lot of motion blur, or even if you just move your head around. I guess that is where people get confused.

With proper motion blur the highest possible frame rate wouldn't cause any issue.

The problem with movies is that motion blur is directly tied to the exposure time, which in turn is directly tied to the frame rate.
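
To put rough numbers on that: with a rotary shutter, the exposure per frame is (shutter angle / 360) times the frame interval, so the classic 180-degree shutter at 24fps blurs each frame over 1/48 s, while 60fps at the same angle only gets 1/120 s. A quick sketch of the arithmetic (plain Python, just to illustrate the relationship):

```python
def exposure_time(fps, shutter_angle_deg=180.0):
    """Seconds the shutter stays open per frame for a rotary shutter:
    it is open for shutter_angle/360 of each frame interval."""
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(24))   # ~0.0208 s (1/48): the classic amount of film blur
print(exposure_time(60))   # ~0.0083 s (1/120): much less blur captured per frame
```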
 
It also adds like 100 or 200ms of input lag.

The input lag doesn't hinder gameplay in something like FFXIII, though. I would never use it in a game like, let's say, Street Fighter 4 or Call of Duty.

edit: A better example would be Halo, since the two I mentioned are already 60fps.
 
I guess my big problem with the "cinematic" and "filmic" approach is that I don't see the world through a camera lens, so a lot of the post-processing additions just seem to interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their rigor toward their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to anywhere near the degree we see in games). It's sad to see great art and texture work completely hidden under a wall of muggy effects.

I think my favorite example is this.

http://dudelol.com/oldimgs/hdr-in-photography-vs-hdr-in-video-games.jpg
 
The "filmic" look of movies at 24fps is caused by post-processing. No game looks more like a film because it is running at 24fps. Get any action film, pause the video during a fast scene and take a screen shot. The thing will be a blurry mess, yet will look high quality when in motion. Due to the real-time nature of video-games these high-quality motion interpolation features cannot be used and instead some games fall back to using different techniques to achieve this motion blur. Personally they give me a headache and do not achieve the desired effect due to the fact that they are noticeable, whereas when watching the Dark Knight for example, you do not notice the extreme motion blur due to the high quality of the interpolation technique.
 
While it may just be some kb/m versus controller thing, I have to assume A LOT of that "feels smoother" is still in the display
Maybe. I've got my PC hooked up to the TV alongside a few consoles, and I've not noticed one being smoother than the others at similar frame rates.

That said, I can't play a game that has VSync enabled if I'm using a mouse/keyboard. I can barely control a simple pan around due to the extra latency, yet I can quite happily play for hours with VSync on if I'm using a 360 controller. Without that experience I'd probably downplay the effect different input devices have on perceived "smoothness" whilst playing.
 
Resolution and fps are the last things I look for in a game... as long as it's at least 30 fps, who cares?

OH GOD SO HORRIBLE 30fps 720p GAWD DAMN MY EYES ITS SO SHITTY


Holding to a 1080p/60fps standard for this gen will hold back a lot of cool potential gameplay and graphics. I would argue a lot of 60fps games last gen were basically better-looking PS2 games: lots of small corridors, arenas, or tracks. Very few 60fps games last gen really pushed new gameplay or amazing graphics (I'll give you Rage, Burnout, and COD), so holding it up as the gold standard doesn't mean it results in better games. Most of the best franchises last gen (Last of Us, Gears of War, God of War, Journey, Halo, Flower) pushed hardware to the limits, resulting in some stellar games that would not have been the same if they had held to a 60fps standard.
 
I suppose framerates, and by extension those who argue over them, have never fazed me. Not to say I cannot tell the difference (I can), but it has never been such a dealbreaker that I am inclined to argue or even discuss it in the first place. If a developer opts for one number over the other, then all right.
 
Resolution and fps are the last things I look for in a game... as long as it's at least 30 fps, who cares?

Resolution and fps are the last things I look for in a game... as long as it's at least 60 fps, who cares?

Not that I really think this, but you can see the point.
 
I'm just quoting myself here because I'm right and most of you are just batshit insane when it comes to this topic. Srsly.

On a more serious note (not that I didn't sort of mean what I just said), I do simplify things a bit in the explanation above, but I'm sure that's what it comes down to. I say I simplify because I'm not taking motion blur/shutter speed into account and such; overall that just means fps is not the only factor, so things are slightly more complex, but not by much.

In short: I just explained everything you need to know to understand the issue in that quote. Read it. /thread

I like this. Being a cinephile I totally get it... Danny Boyle's 28 Days Later felt somewhat artificial because it was shot at 50/60 fps.

In games, no thank you. I like games because I'm in control, not for the story.
 
How many fps does real life run at?

Yes, this is the clarification I was going to ask for.

How could 60 fps really be called "fast forward" (lol) relative to "real life"?


Also, for people who just have to have 24 fps, couldn't a game be built to have all physics calculated at 60 Hz or higher, inputs polled at 60 Hz or higher, etc., to give the desired fluidity of control, while presenting the visual information at 24 fps the way some people like it? Add motion blur until people are satisfied - wouldn't that do the trick?
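
For what it's worth, that decoupling is just the standard fixed-timestep setup; here's a rough Python sketch of the idea (poll_input, step_physics, and render are placeholder callbacks, and the 120 Hz / 24 fps numbers are only examples):

```python
import time

SIM_DT = 1.0 / 120.0    # fixed simulation/input step (example rate)
FRAME_DT = 1.0 / 24.0   # presentation rate some people are asking for

def run(poll_input, step_physics, render):
    prev = time.perf_counter()
    acc = 0.0
    while True:
        frame_start = time.perf_counter()
        acc += frame_start - prev
        prev = frame_start
        # consume the elapsed time in fixed simulation steps,
        # reading input at the top of each step
        while acc >= SIM_DT:
            poll_input()
            step_physics(SIM_DT)
            acc -= SIM_DT
        render()  # draw the latest simulated state at ~24 fps
        # sleep off whatever is left of this frame's budget
        time.sleep(max(0.0, FRAME_DT - (time.perf_counter() - frame_start)))
```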
 
I guess my big problem with the "cinematic" and "filmic" approach is that I don't see the world through a camera lens, so a lot of the post-processing additions just seem to interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their rigor toward their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to anywhere near the degree we see in games). It's sad to see great art and texture work completely hidden under a wall of muggy effects.

I think my favorite example is this.

http://dudelol.com/oldimgs/hdr-in-photography-vs-hdr-in-video-games.jpg
Yeah, the Hollywood envy that seems to be in games really is sort of frustrating at times. Doubly so when it actually glitches: in the Tomb Raider reboot you get water effects on the "camera" that never go away, so you have to quit out and load it back up. Sometimes it's amusing, like a hornet crawling on the lens in MGS3, but it really does feel at times like they're not trying to bring us into a world so much as make it seem like we're watching a movie or TV show. And I'm not even THAT big on immersion, but since games are interactive I'd rather they err on the side of what actually makes sense for you being there rather than replicate our filming technology when it doesn't make sense or serve any purpose.
Maybe. I've got my PC hooked up to the TV alongside a few consoles, and I've not noticed one being smoother than the others at similar frame rates.

That said, I can't play a game that has VSync enabled if I'm using a mouse/keyboard. I can barely control a simple pan around due to the extra latency, yet I can quite happily play for hours with VSync on if I'm using a 360 controller. Without that experience I'd probably downplay the effect different input devices have on perceived "smoothness" whilst playing.
Similarly, I swear FFXIV on PS4 looked basically identical to when I played the PC version on the TV. I tend not to have that issue with input lag; the worst has been when I've tried streaming stuff like OnLive, though even that can be manageable for the right games. Aiming can be a bitch though, which, funnily enough, I imagine could be what screws over any hypothetical cloud-streaming future while shooters are the dominant genre.
 
I don't think anyone has mentioned this (probably), but 60 fps (and higher) feels very different from 30 fps and lower.

If you have games with response times of 10 ms and lower as opposed to 30 ms plus, it's a huge difference. Once you're used to playing at that high a frame rate, playing at 30 fps is like playing in slow motion; the controls feel sluggish and not very responsive.

This has a huge effect in online games especially, but in any skill-based game you will generally play better at a higher frame rate; there are more windows of opportunity.
 
Also, for people who just have to have 24 fps, couldn't a game be built to have all physics calculated at 60 Hz or higher, inputs polled at 60 Hz or higher, etc., to give the desired fluidity of control, while presenting the visual information at 24 fps the way some people like it? Add motion blur until people are satisfied - wouldn't that do the trick?

Nah, because even if you poll input at a higher rate than the visual refresh rate, the player still receives feedback in a delayed fashion, waiting for the next frame.

It could improve the accuracy of the physics simulation, though (and it does).
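
The frame-interval math is the whole point: no matter how fast input is sampled, nothing you did can show up on screen until the next frame is drawn. A quick illustration:

```python
# Worst-case wait before an input can even appear on screen,
# regardless of how often the game polls the controller.
for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> up to {1000.0 / fps:.1f} ms until the next frame")
```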
 
Also, I decided to enjoy this whole debate and show a scene (obviously not interactive for the viewers, which would prove 60fps' worth even more) where high motion is a detriment at a low framerate. I also chose a game with one of the best motion blur implementations next to Ryse: Metro Last Light. Each clip was rendered separately at a locked framerate with VSync so the motion blur stepping works correctly.

24fps
http://a.pomf.se/klrvan.webm
60fps
http://a.pomf.se/orutqz.webm

The 24fps one looks absolutely fine there. Fast camera movement would make it look worse but still not eye-searing like the Dennis one from before.
 
I guess my big problem with the "cinematic" and "filmic" approach is that I don't see the world through a camera lens, so a lot of the post-processing additions just seem to interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their rigor toward their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to anywhere near the degree we see in games). It's sad to see great art and texture work completely hidden under a wall of muggy effects.

I think my favorite example is this.

http://dudelol.com/oldimgs/hdr-in-photography-vs-hdr-in-video-games.jpg
I think the nuances of camera work are so ingrained into our perceptions that it's worth doing these things. I have no problem with it; to me it's visual effects at this point. No, it's not realistic, but then I am watching it on a TV screen in any case. I'm not expecting a totally realistic interpretation of the world; I expect it to look like everything else I see on a TV screen.
 
I'm perfectly fine with a 30fps game on a console but I don't understand why they have to beat around the bush as to why they chose 30. It's because the system is not powerful enough to do 60fps, period.

(Last of Us, Gears of War, God of War, Journey, Halo, Flower) pushed hardware to the limits, resulting in some stellar games that would not have been the same if they had held to a 60fps standard.

I get what you're saying, but oh man, Gears at 60 was soooooo much better.
 
I think the nuances of camera work are so ingrained into our perceptions that it's worth doing these things. I have no problem with it; to me it's visual effects at this point. No, it's not realistic, but then I am watching it on a TV screen in any case. I'm not expecting a totally realistic interpretation of the world; I expect it to look like everything else I see on a TV screen.

I play most of my games on a computer monitor, where I also watch videos, look at and produce artwork, and read and write papers. I guess I don't have that same expectation of the screen anymore, and if anything I think the screen in general is trending toward a more ubiquitous window rather than a camera lens. Besides that, things like chromatic aberration and blurry effects are often glitches and mistakes that filmmakers and photographers avoid, while in gaming they're trends that seem to be wanted.
 
I was just watching The Hobbit DoS Blu-ray and some of the sets look strange even at 24fps. They look fake. Maybe it's the lighting? I don't know. I want to see other movies at 48fps or 60fps.
We need James Cameron to show us the true power of 48fps
 
As the person playing, I found it absolutely playable. Not as great as 60fps of course, but I was able to play. I was even able to parry an enemy and dodge through the turtle enemies' swings.

It doesn't feel terrible, and doesn't look anywhere near as bad as the previously posted examples.

Enemy attack animations play out slower the lower the framerate is :P. As in, the speed is locked to the framerate, so technically, playing at 24 FPS should make the game easier.
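
The bug comes down to advancing the animation per rendered frame instead of per unit of elapsed time; a toy sketch of the two approaches (hypothetical helpers, not the game's actual code):

```python
# Frame-locked: the swing advances a fixed amount every rendered frame,
# so at 24 fps it takes 60/24 = 2.5x as long in real time as at 60 fps.
def advance_frame_locked(anim_pos, units_per_frame=1.0):
    return anim_pos + units_per_frame

# Time-based: the swing advances by elapsed wall-clock time,
# so it plays at the same real-world speed at any framerate.
def advance_time_based(anim_pos, dt, units_per_second=60.0):
    return anim_pos + units_per_second * dt
```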
 
Enemy attack animations play out slower the lower the framerate is :P. As in, the speed is locked to the framerate, so technically, playing at 24 FPS should make the game easier.

I know of that bug, but it's absolutely not in a 1:1 ratio, and it's not ALL of the enemies.

But sure, I'll take it; it might have been a bit easier, I don't know for sure lol.

Journey is barely even a game.

Oh god I really hope you're kidding
 
I thought all PC gamers were throttling fps to 24. Why would anyone want 60 fps? It's not movie-like, and it's suspiciously smooth.
 
As I understand it, we can neither measure nor detect time shorter than 1 Planck unit.
That doesn't mean that time shorter than 1 Planck unit doesn't exist.

I'm sticking with infinite.

I'm not familiar with Planck units... but wouldn't the 'refresh rate' determine the 'fps' of real life? In a steady stream of light, what's the shortest distance between two photons, and how long does it take a photon to move that distance?
 
Resolution and fps are the last things I look for in a game... as long as it's at least 30 fps, who cares?

OH GOD SO HORRIBLE 30fps 720p GAWD DAMN MY EYES ITS SO SHITTY


Holding to a 1080p/60fps standard for this gen will hold back a lot of cool potential gameplay and graphics. I would argue a lot of 60fps games last gen were basically better-looking PS2 games: lots of small corridors, arenas, or tracks. Very few 60fps games last gen really pushed new gameplay or amazing graphics (I'll give you Rage, Burnout, and COD), so holding it up as the gold standard doesn't mean it results in better games. Most of the best franchises last gen (Last of Us, Gears of War, God of War, Journey, Halo, Flower) pushed hardware to the limits, resulting in some stellar games that would not have been the same if they had held to a 60fps standard.

Flower and Last of Us are 60 fps on PS4. The new Halo will be 60 fps. Downgrades all around.
 