Can we please stop with the whole "60 fps is not cinematic" argument.

A lot of it is about expectations. The way people think something is supposed to look can sometimes override the technical advantages or disadvantages of the various options available. Familiarity is a critical factor that has to be considered when making something new, whether as a feat of art or engineering.

Do you think context might matter here, just a little? 60 FPS doesn't mean the same thing for film as it does for computer graphics that are being rendered in real time and controlled by a player. There are lots of things to take into account, not a simple hierarchy of one thing being "the best" and a bunch of deluded people denying it.

It's been said before. 60fps for gameplay is fine; the real problem is the cutscenes.
 
It's been said before. 60fps for gameplay is fine; the real problem is the cutscenes.

If the cutscenes are meant to evoke a movie-like feel, sure, but that's just another consequence of consumer expectations. It also depends on the context, since not all games are necessarily going for a traditional "big screen" look; if a game was meant to evoke the feeling of being shot on a handheld camera, I don't think anyone would bat an eyelash if its cutscenes ran at 60.
 
If the cutscenes are meant to evoke a movie-like feel, sure, but that's just another consequence of consumer expectations. It also depends on the context, since not all games are necessarily going for a traditional "big screen" look; if a game was meant to evoke the feeling of being shot on a handheld camera, I don't think anyone would bat an eyelash if its cutscenes ran at 60.

It's a problem since game graphics and animation are still far from photorealism, so a higher framerate makes it easier to break immersion by letting you notice all the defects in the footage much more clearly.
 
I'm fine with locked 30 fps. The problem with last gen consoles is that most of the time the actual performance was a 20-30fps variation, not a locked 30 FPS.

And I do think 60 looks strange in some games.
 
I'm doing it on my monitor right now and no, I see distinct still images of the cursor.


The human eye doesn't apply motion blur to a high-framerate video.


As I said, soaps are cheaper to produce. Why don't they switch to 30fps?

To the human eye, a video displayed at a high enough framerate is indistinguishable from real life.

Let's say in 100 years Project Morpheus 20 comes out with a 5000k display at 10000Hz, and it renders a scene of a ball flying past your face without applying post-processing motion blur. (So each frame is a static image with no blurring.)

To your eyes that scene would look the same as a ball flying past your face in real life, including the motion blur of the ball. Likewise, if this VR system had hand tracking and you waved your hand in front of your face really fast, it would look exactly the same as it does in real life. At what framerate is this achieved? I don't know the answer to that, sorry.
 
To the human eye, a video displayed at a high enough framerate is indistinguishable from real life.

Let's say in 100 years Project Morpheus 20 comes out with a 5000k display at 10000Hz, and it renders a scene of a ball flying past your face without applying post-processing motion blur. (So each frame is a static image with no blurring.)

To your eyes that scene would look the same as a ball flying past your face in real life, including the motion blur of the ball. Likewise, if this VR system had hand tracking and you waved your hand in front of your face really fast, it would look exactly the same as it does in real life. At what framerate is this achieved? I don't know the answer to that, sorry.

Vision isn't experienced in frames so no.
 
I hope someone has already said this, but "more cinematic" isn't exactly a plus for an interactive media format.

Let video games be VIDEO GAMES, FOR FUCK SAKES.
 
I hope someone has already said this, but "more cinematic" isn't exactly a plus for an interactive media format.

Let video games be VIDEO GAMES, FOR FUCK SAKES.

but the best thing about videogames is they can be anything, including cinematic!
 
Hey, hey hey hey guys.

Games are not, despite marketers telling you otherwise, a cinematic experience.

They absolutely can be. But framerate isn't really relevant to whether something is a cinematic experience or not. Art direction, story, and pacing are far more relevant.
 
People are used to it, that's all there is to it. They've enjoyed 30fps games for so long that when seeing 60fps it feels weird and like fast forward. You soon adjust (as I did when moving to PC gaming) and end up loving it.

As for movies I think it's the same thing. People tell themselves that movies HAVE to be 24fps and anything above is disgusting. It's only because 24fps was initially a cost limitation in early cinema and it has stuck around. Now anything higher looks weird because we've had years of 24fps.

I bet if movies started off in 60 and then people suddenly saw 24 they'd say it was terrible and not more "cinematic"
 
I'm doing it on my monitor right now and no, I see distinct still images of the cursor.
Are you switching your mouse back and forth extremely quickly and not precisely eye-tracking the cursor? Because while I don't have a CRT on hand, I know what they look like. Because of their pulsed brightness, you can also get ghosty imagery by waving your hand in front of the screen, though that's obviously a less convincing test because there are other factors at work.

The human eye doesn't apply motion blur to a high-framerate video.
If it was absurdly high-framerate, yes it does, for the same reason that it applies motion blur to real life.

Think of it this way: in terms of the photons reaching your eyes, there would be basically no meaningful difference between a dark ball moving in front of a bright screen, and a display playing back a 1000000fps no-motion-blur video of a dark ball moving in front of a bright screen.

Vision isn't experienced in frames so no.
No, but a sufficiently high framerate will constitute an indistinguishable approximation to "not in frames".

Much in the same way that the real world isn't rendered onto a fixed pixel grid, yet a highly realistic path-traced render at an absurdly high resolution can be indistinguishable from a complex photograph.
 
They absolutely can be. But framerate isn't really relevant to whether something is a cinematic experience or not. Art direction, story, and pacing are far more relevant.
Why is everything relevant BUT framerate? It certainly changes the feel of a video.

Are you switching your mouse back and forth extremely quickly and not precisely eye-tracking the cursor? Because while I don't have a CRT on hand, I know what they look like. Because of their pulsed brightness, you can also get ghosty imagery by waving your hand in front of the screen, though that's obviously a less convincing test because there are other factors at work.
If I wave my hand in front of me I get motion blur. If I wave my hand in front of the monitor I see discrete frames due to the stroboscopic effect of the display.

If it was absurdly high-framerate, yes it does, for the same reason that it applies motion blur to real life.

Think of it this way: in terms of the photons reaching your eyes, there would be basically no meaningful difference between a dark ball moving in front of a bright screen, and a display playing back a 1000000fps no-motion-blur video of a dark ball moving in front of a bright screen.

No, but a sufficiently high framerate will constitute an indistinguishable approximation.

Much in the same way that the real world isn't rendered onto a fixed pixel grid, yet a highly realistic path-traced render at an absurdly high resolution can be indistinguishable from a complex photograph.

You know, you've said that cameras and the eyes act very similarly. High framerate footage captured by cameras presents LESS motion blur than low framerate footage, so your conjecture contradicts the observed phenomena.
 
If I wave my hand in front of the monitor I see discrete frames due to the stroboscopic effect of the display.
To claim that they sit completely discretely in your visual perception, though, is to claim either that you're running your CRT at absurdly low refresh rates or that you have astronomically better temporal resolution than the typical human. I know exactly what it looks like to wave my hand in front of a CRT; the "frames" are not perfectly distinct in my vision. That is, incidentally, a huge part of why the effect looks so weird.

You know, you've said that cameras and the eyes act very similarly. High framerate footage captured by cameras presents LESS motion blur than low framerate footage, so your conjecture contradicts the observed phenomena.
I'm not seeing the contradiction. Where did I say that high-framerate footage ought to have more motion blur?

Cameras execute time-averaging based on the shutter timing, and the shuttering is usually done as a fraction of the total frame time delta (usually in the neighborhood of 1/2 or "180 degrees"). At higher framerates this means the camera is shuttered for less time on a particular frame and thus each individual frame has less motion blur to it. This is a no brainer. Where do I disagree with it?
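To put rough numbers on that 180-degree shutter rule, here's a quick sketch (purely my own illustration, helper name made up): per-frame exposure is (shutter angle / 360) divided by the frame rate, so doubling the frame rate halves how long each frame is exposed.

```python
def exposure_time(fps, shutter_angle_deg=180.0):
    """Per-frame exposure time for a rotary-shutter camera.

    exposure = (shutter_angle / 360) * (1 / fps); a 180-degree shutter
    exposes each frame for half of its frame interval.
    """
    return (shutter_angle_deg / 360.0) / fps

for fps in (24, 30, 60, 120):
    # At a fixed 180-degree shutter, a higher frame rate means a shorter
    # exposure per frame, hence less motion blur baked into each frame.
    print(f"{fps:>3} fps -> {exposure_time(fps) * 1000:.2f} ms exposure per frame")
```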
 
To claim that they sit completely discretely in your visual perception, though, is to claim either that you're running your CRT at absurdly low refresh rates or that you have astronomically better temporal resolution than the typical human. I know exactly what it looks like to wave my hand in front of a CRT; the "frames" are not perfectly distinct in my vision. That is, incidentally, a huge part of why the effect looks so weird.
Guess the difference is in either manufacturers or our perception of reality.

I'm not seeing the contradiction. Where did I say that high-framerate footage ought to have more motion blur?

Cameras execute time-averaging based on the shutter timing, and the shuttering is usually done as a fraction of the total frame time delta (usually in the neighborhood of 1/2 or "180 degrees"). At higher framerates this means the camera is shuttered for less time on a particular frame and thus each individual frame has less motion blur to it. This is a no brainer. Where do I disagree with it?

Me: "The human eye doesn't apply motion blur to a high-framerate video."

You: "If it was absurdly high-framerate, yes it does, for the same reason that it applies motion blur to real life."

We know that the human eye does NOT apply motion blur to 30, 60, or 120 fps footage, so that rules out low framerates.

Yeah, and for video games higher = better.
During gameplay, yes.
 
What is considered cinematic is a set of moving goalposts. Digital and 3D were previously considered cheap gimmicks and are now industry standard, much to the chagrin of traditionalists, but passionate preference by some hasn't stopped the change. And so it will follow with high frame rates: they will become standard and hence cinematic. I'm betting on the new Avatar movies creating the industry momentum, but it may happen later with even more advanced tech at 120fps or perhaps even 240fps. 24fps for digital 3D is awful. Film looks better for colour and contrast, but is super expensive at high frame rates and underdeveloped for 3D. 24fps being more cinematic is about as useful as leg warmers in the eighties.

I disagree with the current Gaf consensus that frame rates in games and movies are different due to one being an interactive medium and the other a passive medium. The interactive/passive distinction is something that differentiates games from movies, but it is completely irrelevant to the frame rate discussion. If you watch Avatar 2 at 60fps and play MGS5 GZ at a rock-steady 60fps, the effects of the frame rate perceived by your eye are exactly the same. The argument that because you are interacting with the game your perception of the frame rate increases can be applied to any variable. You could apply this to resolution, for example, and yet we quite happily discuss resolution in games and Blu-rays and understand how they relate.

Relate is the important word here, because there are key differences in how frame rates are affected by their source medium. Film doesn't drop frames, for example.

My final point is that I disagree with the idea that gameplay should be 60fps and cutscenes 24fps. Doing that creates a visual inconsistency between in-game and cutscene, and thereby a disconnect between the two. When cinematic games are done well, the jump between gameplay and cutscene is seamless. Here the active and passive argument returns (hence QTEs); by throwing in different frame rates you are exacerbating this very problem. It's similar to why in-engine cutscenes (or at least the appearance of them) are preferable to pre-rendered ones. Gameplay and cutscene need to match visually.
 
The "cinematic" look isn't because of the frame rate, it's because of the exposure time. That's where the blur comes from. You can shoot 24fps with fast film and it will look "weird." There has been a century of motion pictures, and they experimented constantly to create the most natural looking image with the minimal amount of film. 24fps was a minimum, and it relied on motion blur.

I don't think that's something games should strive for. Why not 200fps? It looks better. I'd rather have games look like real life than cinema. Which is ironic, since real life is what film was trying to convey.
 
I love how most of those in favor of higher framerates simply dismiss the perception and preference of those who don't like it.

I don't think that's something games should strive for. Why not 200fps? It looks better. I'd rather have games look like real life than cinema. Which is ironic, since real life is what film was trying to convey.

Not really. Lighting in films is completely unrealistic.
 
I love how all those in favor of higher framerates simply dismiss the perception and preference of those who don't like it.

High-def gaming has only been around for just over a generation. Just because games don't look appropriately cinematic now doesn't mean we can't go to higher frame rates. Who's to say a high frame rate wouldn't lend itself to a more convincing cinematic experience utilising the proper post processing effects?


Not really. Lighting in films is completely unrealistic.

"Realistic" lighting sometimes can't even expose the film. Film uses artificial lighting in order to convey a sense of real light.
 
Guess the difference is in either manufacturers or our perception of reality.
Probably the latter.

Even on a CRT, typical persistence of vision should cause your eyes to be able to pick up two or three copies of the cursor simultaneously when sweeping it very quickly; the screen has crisp response, but the typical human eye can positively retain images in the ballpark of 1 frame at 30fps.

You can easily temporally distinguish the different cursors, because your brain can predict them and can detect the differences in arrival and fade times. But the visual phenomena will exist simultaneously sometimes. The cones and rods in the human eye, and the processing of incoming imagery by the brain, do not have a perfect response.

Me: "The human eye doesn't apply motion blur to a high-framerate video."

You: "If it was absurdly high-framerate, yes it does, for the same reason that it applies motion blur to real life."

We know that the human eye does NOT apply motion blur to 30, 60, or 120 fps footage, so that rules out low framerates.
That doesn't contradict me at all. Obviously low framerates don't produce motion blur, because there's not enough temporal information to do so. That's literally the entire basis of my argument that they produce ghosting instead; rather than capture a smear of continuous imagery that gets averaged (which yields motion blur), your eyes capture the image in one location and the image in the next location and they get averaged (which yields two images superimposed, aka the ghosting I keep referring to).

It's not that the eye does more averaging to produce motion blur when viewing imagery at higher framerates; it's that the same averaging is done, but because it's averaging images in more locations, the result looks less like ghosting and more like motion blur.
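If it helps, here's a toy simulation of that averaging idea (purely my own sketch, with made-up numbers for the eye's integration window and the object's speed): at a low frame rate the eye blends a few widely spaced copies (ghosting), while at an absurdly high frame rate it blends so many closely spaced copies that the result approaches a continuous smear (motion blur).

```python
import numpy as np

def blended_copies(fps, eye_window_s=1/15, speed_px_s=3000.0):
    """Positions of a moving dot that a time-averaging 'eye' blends together.

    The display shows a new, unblurred frame every 1/fps seconds; the eye is
    modeled as averaging everything that arrives within eye_window_s
    (an assumed ballpark of roughly two 30fps frames of retention).
    """
    frame_times = np.arange(0.0, eye_window_s, 1.0 / fps)
    return speed_px_s * frame_times  # one crisp copy per displayed frame

low = blended_copies(fps=30)       # few, widely spaced copies -> ghosting
high = blended_copies(fps=10000)   # many, closely spaced copies -> smear

print(f"30 fps:    {len(low)} copies, {np.diff(low)[0]:.0f} px apart")
print(f"10000 fps: {len(high)} copies, {np.diff(high)[0]:.1f} px apart")
# Same averaging in both cases; only the spacing of what gets averaged
# changes, which is why one reads as ghosting and the other as blur.
```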
 
High-def gaming has only been around for just over a generation. Just because games don't look appropriately cinematic now doesn't mean we can't go to higher frame rates. Who's to say a high frame rate wouldn't lend itself to a more convincing cinematic experience utilising the proper post processing effects?
And I'm not interested in suppressing these advances. I'm simply asking for the option not to have to endure them. HFR movies can also be bought in 24fps versions. Programming such a simple toggle into a game should be trivial to accomplish.

"Realistic" lighting sometimes can't even expose the film. Film uses artificial lighting in order to convey a sense of real light.
That could be in very dark conditions. For most lighting in the real world that isn't an issue.

Probably the latter.

Even on a CRT, typical persistence of vision should cause your eyes to be able to pick up two or three copies of the cursor simultaneously when sweeping it very quickly; the screen has crisp response, but the typical human eye can positively retain images in the ballpark of 1 frame at 30fps.

You can easily temporally distinguish the different cursors, because your brain can predict them and can detect the differences in arrival and fade times. But the visual phenomena will exist simultaneously sometimes.
I can see more than one cursor, but that's not motion BLUR. That's a stroboscopic effect. Completely different things.

That doesn't contradict me at all. Obviously low framerates don't produce motion blur, because there's not enough temporal information to do so. That's literally the entire basis of my argument that they produce ghosting instead; rather than capture a smear of continuous imagery that gets averaged (which yields motion blur), your eyes capture the image in one location and the image in the next location and they get averaged (which yields two images superimposed, aka the ghosting I keep referring to).

It's not that the eye does more averaging to produce motion blur when viewing imagery at higher framerates; it's that the same averaging is done, but because it's averaging images in more locations, the result looks less like ghosting and more like motion blur.

It's a contradiction. Low framerate cameras have longer exposures and produce motion blur, not ghosting. According to you, you need more temporal information to produce the blur, but as footage captured from high speed cameras shows, that's exactly the opposite of reality.

Now, if you're going to keep using the same framework of how the vision works I'd like some sources because it clearly doesn't add up with the evidence.
 
Games aren't movies; they're totally interactive playgrounds with so much going on. More frames is better. I will turn high graphics down just to get that awesome motion of a high frame rate.
 
I can see more than one cursor, but that's not motion BLUR. That's a stroboscopic effect. Completely different things.
I never said that that was "motion BLUR", I said that it was ghosting, but I maintain that their cause is related.

The "motion BLUR" and "ghosting" wouldn't be completely different if the time delta between cursors being displayed was made absurdly small, so that you saw hundreds of blended closely-spaced cursors instead of a couple of large-spaced ones.

Think of it this way: if the CRT was strobing at 100,000Hz, could you distinguish its "stroboscopic effects" from doing things in front of a continuous light source?

It's a contradiction. Low framerate cameras have longer exposures and produce motion blur, not ghosting. According to you, you need more temporal information to produce the blur, but as footage captured from high speed cameras shows, that's exactly the opposite of reality.
You're misinterpreting what I'm writing.

What I mean causes ghosting is viewing a low-framerate video stream that does not have motion blur built into it. It's when you use a time-averaging filter, like your eyes, to capture discretely separated images in a brief span of time. (Your eyes will still "ghost"-blend low-framerate video that does have built-in motion blur, but you won't notice as much, because it's motion-blurred.)

Obviously low-framerate long-shutter film has motion blur instead of ghosting baked into it, because the camera wasn't recording from a low-framerate non-blurred feed, but instead recording from a continuous stream of imagery. If you DID record a low-framerate non-blurred feed with a low-framerate long-shutter camera, you WOULD get ghosting in the result.
 
I think it's safe to say and everyone agrees that gameplay at 60 is ALWAYS better, but why do I see people saying that in cutscenes 60 looks weird? I played MGS3 HD on 360 and the cutscenes running at 60 were delicious in comparison to the ones in the PS2 version.
 
I think it's safe to say and everyone agrees that gameplay at 60 is ALWAYS better, but why do I see people saying that in cutscenes 60 looks weird? I played MGS3 HD on 360 and the cutscenes running at 60 were delicious in comparison to the ones in the PS2 version.

The "30 fps is more cinematic" nonsense has been repeated so many times that a lot of people accept it as fact without taking a second to stop and think about how utterly ridiculous that notion is.
 
Why do games need to be cinematic in the first place? And in what world is 24 FPS considered good enough? It's only still the movie standard because of tradition, and perhaps because it would break the immersion for some people, but games don't have that problem.
 
When I look at footage of a television show in 48fps, it sometimes feels more like I'm a camera man, and that I am just right there watching these actors play their part. I can see all of the fake blood, the invisible harnesses, and the cardboard backgrounds that I may not have noticed in other films. It makes the show feel much less real. Maybe it is only that way because I am used to it like that. Who cares? 48fps isn't suddenly the definition of cinematic just because 24fps is old and one popular movie (The Hobbit) is filmed in 48fps. 24fps is how we view most pieces of cinema, so it is logical that people would prefer cinematic games (or just cutscenes) to be in 30fps. Just because you prefer 60fps doesn't mean that every single other person in the world should also. It is incredibly ignorant to attack people who say that they feel that 60fps games are less cinematic.
 
I never said that that was "motion BLUR", I said that it was ghosting, but I maintain that their cause is related.

The "motion BLUR" and "ghosting" wouldn't be completely different if the time delta between cursors being displayed was made absurdly small, so that you saw hundreds of blended closely-spaced cursors instead of a couple of large-spaced ones.

Think of it this way: if the CRT was strobing at 100,000Hz, could you distinguish its "stroboscopic effects" from doing things in front of a continuous light source?


You're misinterpreting what I'm writing.

What I mean causes ghosting is viewing a low-framerate video stream that does not have motion blur built into it. It's when you use a time-averaging filter, like your eyes, to capture discretely separated images in a brief span of time. (Your eyes will still "ghost"-blend low-framerate video that does have built-in motion blur, but you won't notice as much, because it's motion-blurred.)

Obviously low-framerate long-shutter film has motion blur instead of ghosting baked into it, because the camera wasn't recording from a low-framerate non-blurred feed, but instead recording from a continuous stream of imagery. If you DID record a low-framerate non-blurred feed with a low-framerate long-shutter camera, you WOULD get ghosting in the result.

Motion blur found in video footage is due to the exposure time. A single frame doesn't represent a single instant in time but a period of time in which objects might move, creating the typical stretched look known as motion blur. The higher the framerate at which a camera captures video, the lower the exposure time has to be, leading to LESS motion blur.

If you have any evidence that support your theoretical framework of motion perception please share it because otherwise it seems completely made up in light of observations of the real world.

Framerate has nothing to do with something being cinematic or not.
Is that your expert opinion?

When I look at footage of a television show in 48fps, it sometimes feels more like I'm a camera man, and that I am just right there watching these actors play their part. I can see all of the fake blood, the invisible harnesses, and the cardboard backgrounds that I may not have noticed in other films. It makes the show feel much less real.

And that's exactly why so many of us dislike high-framerate photorealistic video.
 
Motion blur found in video footage is due to the exposure time. A single frame doesn't represent a single instant in time but a period of time in which objects might move, creating the typical stretched look known as motion blur. The higher the framerate at which a camera captures video, the lower the exposure time has to be, leading to LESS motion blur.
I've said that exact thing multiple times, and I'm not sure what you think you're responding to.

I never, EVER claimed that high-framerate video tends to use more motion blur frame-for-frame. My claim is that, if the human visual system were to view absurdly (like, SUPER absurdly) high-framerate video, the time-averaging effect of vision would produce a motion-blurred result through exactly the same mechanism by which it produces motion blur when looking at "real" moving objects.

If you have any evidence that support your theoretical framework of motion perception please share it because otherwise it seems completely made up in light of observations of the real world.
I would, but unfortunately the phrase "persistence of vision" is wrapped up in an absolutely idiotic discussion amongst film theorists, and it's hard to google this stuff. Things like this VR discussion are some of the most interesting tidbits, if unfortunately not entirely hard science; this particular example uses the concept I've brought up plenty (though it only explicitly says "persistence of vision" once).

Suffice it to say, it's reasonably well recognized that the human visual system forms a temporal low-pass filter of sorts, time-averaging incoming stuff. It explains why stroboscopic ghosting is noticeable in cases where you're pushing a decent frequency*, it explains why CRTs and film projection don't flicker if you use a respectably high refresh rate**, and it explains why motion blur occurs in human vision***.

I can't fathom how you think my claims "seem completely made up in light of observations of the real world," except that you seem to be misinterpreting most of what I say. In any case, if my assumption about visual time-averaging is wrong, I also can't fathom what you think causes motion blur in human vision.

*Because your vision doesn't drop stuff instantly, you might still be in the process of perceiving a past event when you begin perceiving a subsequent event.

**Human vision is literally smearing over the moments where the phosphors are dark with the bright moments. This is a good thing, because if our vision had the temporal sharpness to cleanly pick out the dark moments, we'd see the flicker pretty raw, which would make CRTs and film projection obnoxious enough to be basically unwatchable.

***Essentially the same reason that stroboscopic ghosting occurs, but over more continuous imagery.
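If you want to play with the low-pass idea yourself, here's a crude sketch (mine, with an assumed time constant, not a measured property of the eye): run a strobed brightness signal through a simple leaky integrator and look at how much ripple survives at different refresh rates.

```python
import numpy as np

def perceived_ripple(refresh_hz, tau_s=0.02, duty=0.1, sim_s=0.5, dt=1e-5):
    """Residual flicker after a leaky-integrator 'persistence' filter.

    The display emits short bright pulses at refresh_hz (duty cycle `duty`);
    y += (x - y) * dt / tau_s crudely models the eye's temporal averaging.
    Returns the peak-to-peak ripple of the filtered signal (0 = looks steady).
    """
    t = np.arange(0.0, sim_s, dt)
    x = ((t * refresh_hz) % 1.0 < duty).astype(float)  # strobed input
    y = np.empty_like(x)
    acc = 0.0
    for i, xi in enumerate(x):
        acc += (xi - acc) * dt / tau_s
        y[i] = acc
    settled = y[t > 0.2]            # ignore the start-up transient
    return settled.max() - settled.min()

for hz in (24, 60, 1000):
    print(f"{hz:>4} Hz strobe -> ripple {perceived_ripple(hz):.3f}")
# The same filter that leaves visible flicker at 24 Hz produces an almost
# perfectly steady output at very high refresh rates.
```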

//=================

Anyway, we've been going in circles literally all day, and I'm not going to respond to another "your claim is contradictory because high framerate video has less motion blur" post.
 
It's been said before. 60fps for gameplay is fine; the real problem is the cutscenes.

I definitely think this is true. I was playing the definitive edition of Tomb Raider on my ps4 and the cutscenes at first were jarring. The gameplay is fine at 60fps, but the cutscenes for some reason look really, REALLY fucked up.
 
I've said that exact thing multiple times, and I'm not sure what you think you're responding to.

I never, EVER claimed that high-framerate video tends to use more motion blur frame-for-frame. My claim is that, if the human visual system were to view absurdly (like, SUPER absurdly) high-framerate video, the time-averaging effect of vision would produce a motion-blurred result through exactly the same mechanism by which it produces motion blur when looking at "real" moving objects.

As I said before, if you have evidence for your views, please present it. A forum thread filled with speculation isn't evidence. In fact, the thread contradicts you by suggesting that a higher framerate will get rid of the blur.

As it stands right now, the brain can tell the difference between actual moving objects and a flickering screen. At no point does the brain add motion blur to moving objects in video footage, regardless of framerate, which is why developers have to add it as a post-processing effect.

I definitely think this is true. I was playing the definitive edition of Tomb Raider on my ps4 and the cutscenes at first were jarring. The gameplay is fine at 60fps, but the cutscenes for some reason look really, REALLY fucked up.
Yes, the extra information destroys the suspension of disbelief.
 
There was a cutscene in Infamous Second Son where Delsin was talking to somebody. When the camera was on Delsin, the city was behind him. When the camera swung to the other guy, there was nothing but sky. So the framerate kept going from ~45 to 60, back to ~45, back to 60, and it was pretty damned wonky.
 
In fact, the thread contradicts you by suggesting that a higher framerate will get rid of the blur.
The blur in question was caused by trying to eye-track an object (which causes your eyes to continuously move relative to your head) while you rotate your head (thus moving the physical screen). This means that the viewer's eyes are moving relative to the screen. Since the screen isn't changing over the course of a frame, the motion blur in question was caused by the actual motion of the screen relative to the eye. A higher framerate would help because the image on-screen would be rotated more often to its correct position, and so the movements of the eye-tracked on-screen object relative to the eye would be smaller.

The blur in question is an "incorrect" effect, it's produced by other sources of motion than what I'm claiming would produce motion blur, and my assumption has no trouble explaining why it happens.
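Rough numbers for that, if it helps (my own back-of-the-envelope sketch; the speed is just a placeholder): the eye tracks the object continuously, but the image only moves once per frame, so the worst-case gap between where the eye points and where the object is drawn scales as speed divided by frame rate.

```python
def tracking_gap_px(fps, speed_px_s=2000.0):
    """Worst-case gap between a smoothly tracking eye and an object that is
    only redrawn every 1/fps seconds: it lags by up to speed / fps pixels."""
    return speed_px_s / fps

for fps in (30, 60, 120, 1000):
    print(f"{fps:>4} fps -> up to {tracking_gap_px(fps):.1f} px of eye-tracking smear")
# Raising the frame rate shrinks this smear proportionally, which is why the
# quoted VR discussion expects higher frame rates to reduce that blur.
```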

At no point does the brain add motion blur to moving objects in video footage, regardless of framerate, which is why developers have to add it as a post-processing effect.
That's because real-world framerates that are currently in use are vastly lower than what I'm referring to, which I've pointed out repeatedly.
 
Eh, 60 fps looks really weird to me. Like, I'll probably get used to it, but its definitely weird seeing such a massive jump. I was honestly fine with the Hobbit fps jump, but I have a few friends that thought it was kind of weird.
 
That's because real-world framerates that are currently in use are vastly lower than what I'm referring to, which I've pointed out repeatedly.

And we go right back to the beginning. As shown by high speed camera footage, no: higher framerates = less motion blur, because the exposure time for each frame is shorter. As I asked before, if you have proof to the contrary, please submit it.
 
And we go right back to the beginning. As shown by high speed camera footage, no: higher framerates = less motion blur, because the exposure time for each frame is shorter. As I asked before, if you have proof to the contrary, please submit it.

I think you are both arguing different points. :-P

You are correctly saying that if you filmed something at an absurdly high framerate, paused the film, and looked at a single frame, that image would have NO motion blur.

Then the argument being correctly made is that if that film is played back and there is a fast-moving object in it, the blur your eyes perceive would be identical to watching the same scene directly in real life.

You're both right, but talking about completely different things!
 
Cutscenes should be 24fps - Filmic, cinematic, whatever.
Gameplay should be 60fps

If you're talking about within the same game, that's the worst.

This would look really jarring. I already don't like games that go from 60 to 30 fps but 24 would be even worse...

Yeah really.

Games are a different medium than film, so they shouldn't be crippled in terms of performance to make them feel like something they are not.

If everything is being rendered in-game there's absolutely no point to arbitrarily cut the frame rate in half or more for cutscenes. What's the point? To make people think that they're watching a movie all of a sudden? Let the game render everything consistently if there's no technical reason to limit it. You're seeing the same assets you see during gameplay, but now because you're no longer in control, you're ok with the visual presentation being dramatically altered? That makes no sense to me.

Now if you're talking about FMV cutscenes, then there's no choice. But FMVs are terrible space-wasters, and in-game graphics are good enough now, so there's no reason to even use them.
 