In defense of the "filmic" look.

And isn't Journey coming to PS4 (presumably in 1080p/60fps) or is that just me dreaming?

They want to, as far as I remember, but it's up to Sony. Maybe something at E3. Hopefully they prevent it and stay true to the premium cinematic experience. They're already ruining The Last of Us for these 60fps sickos.
 
my leg!
 
People are just used to movies being 24fps, but if all movies were 60fps everybody would hate anything less. The same goes for gaming: after playing a lot of 60fps games, going back to The Last of Us was an eyesore.
60fps is simply better than 30fps, and that isn't an opinion but a fact. The argument that 30fps is more cinematic is BS, on par with MS talking about the "power of the cloud".
 
But movie-like motion blur isn't possible in games. You can't demand a movie-like experience with today's capabilities.

For example, here is a Portal 2 video running at 24fps with proper movie motion blur:
https://www.youtube.com/watch?v=cKRE6yG74a4

This is what the OP wants, and it's not that bad in terms of visuals (not counting gameplay). The problem is, this ~2 minute video took about 2 hours to render. It's just not possible to render all that per-object motion blur in real time, so we can never get a true movie-like experience in games. Therefore, demanding 24fps in games because of that "movie" feel is just wrong.
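
To make the cost concrete, here's a rough sketch of how that kind of offline blur gets produced (Python; `render_at(t)` is a hypothetical stand-in for a full scene render at time t, so this illustrates the general technique, not Valve's actual pipeline):

```python
import numpy as np

def film_blur_frame(render_at, t0, fps=24, shutter_angle=180, samples=64):
    """Approximate film-style motion blur by averaging many sub-frame
    renders taken while the virtual shutter is open. A 180-degree
    shutter at 24fps exposes each frame for (180/360)/24 = 1/48 s."""
    exposure = (shutter_angle / 360.0) / fps
    subs = [render_at(t0 + exposure * i / samples) for i in range(samples)]
    return np.mean(subs, axis=0)  # one output frame = 64 full renders
```

Rendering 64 full frames for every frame you actually show is why a ~2 minute clip can take hours, and exactly the work a real-time engine can't afford.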
 
Isn't it partly because games are rendered in real time, so they have to account for things like refresh rates and synchronization, whereas video is already pre-rendered?
 
This kinda reminds me of how I had to explain and SHOW the difference between SD and HD at my brother's house to literally 4 people on their 50-inch HD TV with FiOS. The difference is day and night to me, but they swore up and down that there was no difference.
 
If you move your hand close to your face you'll see a lot of motion blur, or even if you just move your head around. I guess that's where people get confused.

No, it is not even possible for a screen to have a lower amount of blur than any object moving across your vision. The blur of an object is caused by the rods and cones in your eye changing their state (and thus the charge they send down your optic nerve); the very same thing happens with anything you are looking at, including a screen.
 
This kinda reminds me of how I had to explain and SHOW the difference between SD and HD at my brother's house to literally 4 people on their 50-inch HD TV with FiOS. The difference is day and night to me, but they swore up and down that there was no difference.

These people, rampant in society, scare the living shit out of me.
 
Nothing wrong with Hollywood envy. They just need to actually pull it off instead of hiring these amazingly diverse HACKS who write everything for most studios. Why they hire these people, I have not a clue.
 
I really don't like the way movies look in 60fps. It looks so weird, like it's too real. I'm not sure exactly why it feels so off; maybe it's because it just doesn't have that blurry movie effect. And no, it really isn't the same thing as playing a game. Playing a game at 30fps is a pain when you know what 60 feels like, but watching a movie in 60 is just odd. I'm gonna have to get used to it, but I don't like it.
 
Ocarina was great in spite of its framerate, not because the 20fps made it look more cinematic. If OoT ran at 60fps it would have been that much better.
Maybe, maybe not... who knows :P

BTW, I never found OoT unplayable (the opposite: the gameplay was really great)... I'm just pointing out that claims like "less than 30fps is unplayable" are false... what makes a game unplayable is the drops below 30fps... you can have a really great, playable game locked at 25fps or even less.
 
But movie-like motion blur isn't possible in games. You can't demand a movie-like experience with today's capabilities.

For example, here is a Portal 2 video running at 24fps with proper movie motion blur:
https://www.youtube.com/watch?v=cKRE6yG74a4

This is what the OP wants, and it's not that bad in terms of visuals (not counting gameplay). The problem is, this ~2 minute video took about 2 hours to render. It's just not possible to render all that per-object motion blur in real time, so we can never get a true movie-like experience in games. Therefore, demanding 24fps in games because of that "movie" feel is just wrong.

It's not necessarily what I want; as I said, I'm more playing devil's advocate. But it bothers me that people don't understand what they mean by a "cinematic look".

People are just used to movies being 24fps, but if all movies were 60fps everybody would hate anything less. The same goes for gaming: after playing a lot of 60fps games, going back to The Last of Us was an eyesore.
60fps is simply better than 30fps, and that isn't an opinion but a fact. The argument that 30fps is more cinematic is BS, on par with MS talking about the "power of the cloud".

I guess I'd have to see it to believe it, and BTW it's not even remotely the same thing. People all over, from the best filmmakers in the world to your average Joe, have targeted that 24fps feel for decades now. We have more than enough technology to change it, but most prefer the look of 24fps. While this doesn't necessarily translate directly to games, it's stupid to straight-up ignore the idea. People are being so damn elitist about it.
 
But movie-like motion blur isn't possible in games. You can't demand a movie-like experience with today's capabilities.

For example, here is a Portal 2 video running at 24fps with proper movie motion blur:
https://www.youtube.com/watch?v=cKRE6yG74a4

This is what the OP wants, and it's not that bad in terms of visuals (not counting gameplay). The problem is, this ~2 minute video took about 2 hours to render. It's just not possible to render all that per-object motion blur in real time, so we can never get a true movie-like experience in games. Therefore, demanding 24fps in games because of that "movie" feel is just wrong.

That looks good. I'd love to try that in real time.
 
I didn't count, but I don't believe the first webm was at a real 24fps. I mean, N64 games used to be sub-20fps and they didn't look as stuttery as that webm.
 
But movie-like motion blur isn't possible in games. You can't demand a movie-like experience with today's capabilities.

For example, here is a Portal 2 video running at 24fps with proper movie motion blur:
https://www.youtube.com/watch?v=cKRE6yG74a4

This is what the OP wants, and it's not that bad in terms of visuals (not counting gameplay). The problem is, this ~2 minute video took about 2 hours to render. It's just not possible to render all that per-object motion blur in real time, so we can never get a true movie-like experience in games. Therefore, demanding 24fps in games because of that "movie" feel is just wrong.

That's a very high-quality filmic motion blur. You could use a lower-cost version on today's consoles for a very passable effect. Even Tekken 6 and TT had a nice motion blur effect (and those were 60fps games).

http://www.youtube.com/watch?v=8RV6QuiLe1s
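
For reference, the "lower cost" version console games use is usually a post-process pass that smears each pixel along its screen-space velocity. A minimal NumPy sketch of the idea (an illustration of the general technique, not any particular game's implementation):

```python
import numpy as np

def velocity_blur(color, velocity, samples=8):
    """Cheap post-process motion blur: average a handful of color
    samples taken backwards along each pixel's velocity vector.

    color:    (H, W, 3) float image from the renderer
    velocity: (H, W, 2) per-pixel motion in pixels per frame
    """
    h, w, _ = color.shape
    ys, xs = np.mgrid[0:h, 0:w]
    acc = np.zeros_like(color)
    for i in range(samples):
        t = i / max(samples - 1, 1)  # 0..1 along the blur vector
        sx = np.clip(xs - velocity[..., 0] * t, 0, w - 1).astype(int)
        sy = np.clip(ys - velocity[..., 1] * t, 0, h - 1).astype(int)
        acc += color[sy, sx]
    return acc / samples
```

Eight samples per pixel instead of 64 full renders per frame is the trade-off that makes this feasible at 60fps.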
 
Resolution and fps are the last things I look for in a game... as long as it's at least 30fps, who cares?
There's a (vast) difference between playing a game regardless of its performance when not afforded the choice of a higher framerate or resolution, versus actively preferring a lower framerate due to a subjective notion of it being more "cinematic".
 
High framerate sucks in movies. Just like all those "true motion" effects, it gives movies and TV shows that "soap opera" effect.

I think as far as gameplay goes, 60fps will always be superior; controls feel much more responsive at high frame rates. As for how it looks... I'm personally not a fan.

If we're talking about an online shooter, I will always prefer that they hit a good framerate rather than mind-blowing graphics. But if we're talking about a graphically heavy single-player game that goes for the "cinematic" experience, like, say, Uncharted, I have no problem with them pushing the graphics as far as they can at the cost of running the game at 30fps.


Also, I don't get why some people hate "cinematic games" so much.
Sure, I'd also hate it if every game was like that. But a well made "cinematic experience" every now and then is fun.
 
No, it is not even possible for a screen to have a lower amount of blur than any object moving across your vision. The blur of an object is caused by the rods and cones in your eye changing their state (and thus the charge they send down your optic nerve); the very same thing happens with anything you are looking at, including a screen.

You completely misunderstood what I said.

Regardless, the nature of motion blur is completely different in a game when you are looking at a screen; if you play an FPS that doesn't artificially introduce motion blur, you can see that clearly.
 
High framerate sucks in movies. Just like all those "true motion" effects, it gives movies and TV shows that "soap opera" effect.
lol.

The "true motion" effect you speak of just makes the in-between frames up, better known as "Interpolation". It's fine for panning but breaks whenever anything moves to quickly which is why you'll usually see certain things moving at their original 24FPS rate while other slower moving things are moving at 60FPS. So of course it looks like ass. Also, this stuff was obviously shot and edited with 24FPS in mind.
 
But movie-like motion blur isn't possible in games. You can't demand a movie-like experience with today's capabilities.

For example, here is a Portal 2 video running at 24fps with proper movie motion blur:
https://www.youtube.com/watch?v=cKRE6yG74a4

This is what the OP wants, and it's not that bad in terms of visuals (not counting gameplay). The problem is, this ~2 minute video took about 2 hours to render. It's just not possible to render all that per-object motion blur in real time, so we can never get a true movie-like experience in games. Therefore, demanding 24fps in games because of that "movie" feel is just wrong.

Here's the thing: the video provided basically proves the premise that framerate matters less than image presentation. The idea that the current technical implementation isn't precisely on par with its real-world equivalent is no different from bump-mapping being an imperfect way of reproducing texture on a surface. Games with high-quality, object-based motion blur have been on the market for years; Project Gotham Racing 3 and Crysis jump to mind. Getting closer and closer to proper cinema-quality precision advances with the hardware behind it. You just provided the proof of concept for why the premise of a "filmic" look can work. Well, that and how good The Order has looked in presentations.

This next statement is directed toward the discussion, not you in particular. I don't want to give the impression that frame-rates shouldn't strive to match the highest refresh-rates our displays can manage (there's a reason I prefer the Forza series for console racing titles), but acting as though there isn't a reasonable rationale for why higher-quality visual effects and detail in a scene might take priority in some titles over the highest of frame-rates ignores the realities of modern visual design.
 
Here's the thing: the video provided basically proves the premise that framerate matters less than image presentation. The idea that the current technical implementation isn't precisely on par with its real-world equivalent is no different from bump-mapping being an imperfect way of reproducing texture on a surface. Games with high-quality, object-based motion blur have been on the market for years; Project Gotham Racing 3 and Crysis jump to mind. Getting closer and closer to proper cinema-quality precision advances with the hardware behind it. You just provided the proof of concept for why the premise of a "filmic" look can work. Well, that and how good The Order has looked in presentations.

This next statement is directed toward the discussion, not you in particular. I don't want to give the impression that frame-rates shouldn't strive to match the highest refresh-rates our displays can manage (there's a reason I prefer the Forza series for console racing titles), but acting as though there isn't a reasonable rationale for why higher-quality visual effects and detail in a scene might take priority in some titles over the highest of frame-rates ignores the realities of modern visual design.

I would be really interested to see the above effect used in 60fps gameplay footage, perhaps recorded at something like 960fps.
 
People are just used to movies being 24fps, but if all movies were 60fps everybody would hate anything less. The same goes for gaming: after playing a lot of 60fps games, going back to The Last of Us was an eyesore.
60fps is simply better than 30fps, and that isn't an opinion but a fact. The argument that 30fps is more cinematic is BS, on par with MS talking about the "power of the cloud".

I totally agree with this. Folks with a 120Hz TV already get a pseudo-high-framerate picture. I remember the first time I saw a TV like that it caught me off guard, and other folks commented on how things looked too real. However, I got used to it pretty quickly and now it's my preferred way of watching TV and movies. 24fps has its merits and certainly brings a certain tone to a film, but it has severe limitations, most notably panning. My god, I don't know how any director worth his salt could put a jittery, seizure-inducing panning shot in their movie. It's like fingernails on a chalkboard for your eyes.

A higher framerate in film is something I always wanted to try but figured was a lost cause until Peter Jackson's Hobbit in HFR. I know there were a lot of complaints about HFR, but for me it was a magical experience. Not only did it make 3D watchable, it pushed 3D's immersion beyond gimmick, and the real win was the glorious, smooth pan shots.

Higher framerate is even more important with gaming due to the user feedback. I think a rock solid 30fps is tolerable and the game can be balanced around it, but 60fps is always preferable, even for a game that is supposedly "filmic".
 
Thoughts from a friend of mine:
I wanted to say a few things about TV, film, games, framerate, and even offer a little opinion. I hope you'll forgive the long entry. I'd like to clear up a lot of the confusion and misinformation that gets passed around whenever this topic comes up.

First of all, the human eye doesn't see in frames. The experience of what you see, "photo-realism" or whatever you want to call it, cannot be replicated by an external source. What passes from your eyes to your brain can only be experienced by them. No photograph, computer render, or estimation of motion can ever accurately portray your perceived sense of reality. It's not real and can never be real. All we can do is emulate it through feeble human means.

Even though the human eye doesn't see in frames, it can still perceive hundreds of frames per second. Pilots have identified a flashed image of a plane at 200+ frames per second in simulators. Even though 60fps is considered great and the gold standard these days, PC gamers ran at 120fps or higher back in the CRT days. Despite diminishing returns in the power needed to render above 60, you can see a great deal of increased smoothness and responsiveness over 60Hz.

This is not well known and leads to a lot of confusion: even though film is 24 frames per second, it is flashed 48 times per second in cinemas. Every frame is flashed twice. "Persistence of vision" only holds at around 40 flashes per second or higher; anything lower and the image flickers and the illusion is destroyed. There's a reason 50Hz and 60Hz are used for TV: they are above that threshold. Just because film is 24fps does not mean it is good enough for smooth playback on its own.

Did you know that the electricity coming out of your wall sockets is also 50Hz/60Hz? That isn't a coincidence. If it were any lower, lights would flicker disturbingly to your eyes. Early TVs actually used the electrical frequency to keep the image in sync, which is why TVs had fixed refresh rates of 50Hz and 60Hz in their respective regions. Sound familiar?

The older TV standard did not have enough over-the-air bandwidth to carry 60 progressive frames, and 30 wasn't enough to keep the image flicker-free, so the vile, but at the time necessary, invention of interlacing was born. By displaying 60 half-fields it appeared there were 30 frames on your old CRT TV, but in actuality you weren't seeing 30 or 60 full frames. The result, however, was a smooth image within those limitations.
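
Interlacing in miniature, as a sketch (assuming frames are simple 2D arrays):

```python
def frames_to_fields(frames):
    """Split each full frame into two half-height fields (even
    scanlines, then odd scanlines), doubling the flicker rate
    without doubling the bandwidth: 30 frames per second go out
    as 60 fields per second."""
    fields = []
    for frame in frames:            # frame: (height, width) array
        fields.append(frame[0::2])  # top field: even lines
        fields.append(frame[1::2])  # bottom field: odd lines
    return fields
```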

It got even more complicated when color TV was introduced. To make the new standard backwards compatible, black-and-white sets had to be able to ignore the new color information while color sets decoded the chroma, and the frame rate was actually reduced slightly to make it all fit. This oddity persists in all modern standards, and we still don't have truly round frame rates: when we say 30 we mean 29.97, and 60 is 59.94.
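
The exact factor is 1000/1001 (introduced, to my understanding, so the color subcarrier wouldn't interfere with the audio carrier):

```python
# NTSC color lowered the old 30/60Hz rates by a factor of 1000/1001:
print(30 * 1000 / 1001)  # 29.97002997... the "30fps" of NTSC video
print(60 * 1000 / 1001)  # 59.94005994... the "60fps" field rate
```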

One of the problems of converting 24fps film to the TV standard is judder, the "hiccup" effect you see when frames are not on screen for a uniform amount of time. You cannot fit 24 into 30 or 60 without compromise. The trick used is 3:2 pulldown (telecine): alternating film frames are held for three fields, then two, which introduces the problem, since every other frame sits on screen 50% longer. Newer TVs with 120Hz refresh rates can instead flash every frame the same 5 times and eliminate this judder. In PAL territories it was even worse: 24 into 50 has no easy solution, so film is simply played at 25fps, and many PAL transfers have the sound sped up by about 4% to match, leaving voices slightly higher-pitched. A similar mismatch is why many old PAL games ran about 17% slower than their NTSC counterparts. Thankfully these issues are mostly behind us, and I hope for a future where our display devices are not locked to any single refresh rate, along the lines of G-Sync and FreeSync.
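
The cadence and the PAL speed-up are easy to see in a few lines (a sketch of the field pattern, not a real telecine pipeline):

```python
def pulldown_32(frames):
    """3:2 pulldown: hold film frames for three fields, then two,
    alternating, so 24 frames fill 60 fields each second. The
    uneven hold times are exactly the judder 'hiccup' described
    above."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields  # 24 frames in -> 12*3 + 12*2 = 60 fields out

# PAL sidesteps pulldown by simply playing 24fps film at 25fps,
# which is why unpitched PAL transfers sound about 4% high:
print(25 / 24 - 1)  # 0.041666... ≈ 4.2% speed-up
```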

I want to make it clear that 60 frames per second is not actually fast-forwarded. You are not used to the increased smoothness of 48fps HFR or 60fps video, so it looks faster, but it is NOT faster. Objectively, the argument that higher frame rates look worse is bunk. We prefer 24 and there is NOTHING wrong with that. Art is subjective, and there are many compelling arguments for the less-is-more crowd, but it ultimately depends on what the artist and the audience like or want. There have been many superior technologies the public shunned because they didn't like how alien they felt.

I also want to point out that the "soap opera effect" you see mentioned can refer to several things. It usually refers to 60i/p video, which looks very smooth. It can also mean the interpolation filter on most 120Hz TVs sold today. This interpolation is not film and it is not video; it is a trick that does a cheap job of creating fake in-between frames for 24, 30, and 60fps material, up-converting the perceived motion to 120Hz. It blends pairs of frames to smooth movement but doesn't actually increase the amount of information the viewer is seeing. Do not take this interpolation effect on TVs as an indicator of what is good or bad about a native frame rate.

There have been many different frame rates for film, but 24 became the standard. There doesn't have to be a single standard for games either, if that's what creators want, but the choice shouldn't be based on what another medium looks or feels like; it should be based on what's best for the game.

Arguing that certain frame rates are better than others for passive media is petty, and different frame rates for film and video can co-exist. Games, however, are not movies. Film and TV are not interactive and require no input. You can like or dislike a game's frame rate on aesthetics, but the most important thing is input response. You don't want games with low frame rates, because responsiveness is central to interactivity.

Rules are meant to be broken, though, and things can be flexible. Not all games require lightning-fast input, and in the future we might even be able to run a game at 30fps or lower while sampling input at high precision, decoupled from the frame rate. And none of this accounts for other factors we currently deal with, like display-added motion blur and input lag.

I must admit I am very worried about, and disappointed in, the direction game discussion takes these days. Even though I love helping people be informed, I grow weary of always seeing these technical and artistic debates. Setting aside "filmic" not being a word, the game industry, journalists, and consumers seem obsessed with games being film in everything from appearance to execution. I feel they do not understand the pros and cons of the medium, and it looks insecure when they constantly go out of their way to imitate or praise another medium considered "established, accepted, and familiar."

What games convey cannot be captured in screenshots, video clips, graphics, technical techniques, etc. While presentation is an important part of an experience, games are, at their core, interactive experiences with rules and goals. You do not want to take that away or dumb it down, yet we heavily restrict what games are and should be in favor of a controlled system that strips away most of the advantages of the interactive medium in favor of the rules of a completely different one.

For video games to actually be understood and accepted, we need better critical analysis and better tools for discussing what makes a game good, and those do not include the overwhelming focus on graphics, technical systems, hype, marketing, and other superficial qualities. If all you ever see of a game is the developer talking about their amazingly engineered system for making leaves blow in the wind, how it's like a Hollywood blockbuster, how cinematic it looks, or how photo-realistic it is, you should give serious pause to what is important to them. It's almost certain they spent more of the budget on making the game look great than on making it play great.

Gamers today don't seem to want games. Games aren't passive entertainment, but that's what people seem to want. I definitely feel the industry has turned to making ultra-linear $60 movies where you press up to watch, with a myopic focus on low risk, high budget, and high return, and I can't help but feel this approach is eating the industry alive and, more importantly, giving us mediocre games. In turn, to make up for the lack of an actual game worth owning and replaying, they have turned to DLC, pre-order bonuses, and tacked-on multiplayer modes.

I don't blame some people, though; it's understandable. You come home from a long, hard day of work and you want to relax. You aren't a kid anymore, and you don't have a lot of free time but do have disposable income. The thing is, games aren't passive. They aren't like movies or TV; they require you to engage with them. We need to be part of the solution and stop being lazy. Games are hard work and require effort, thinking, and learning.

We don't need any more shallow Games of the Year that pass off bottom-of-the-barrel design, like moving ladders, planks, and pallets, as the standard of quality because anything else would ruin the immersion. We praise the amazing character development, the impressive physics and animation systems, the carefully crafted cinematics, the linear narrative, the emotion, and how it's raising games to art, completely ignoring that none of that is gameplay. You know, the thing that makes a game a game. And beyond that, you never seem to hear about how fun a game was, how well it controlled, or how good the level design was. We need games that put game design first and everything else second. Games aren't even good at telling linear stories, because by their nature games have so much more potential to go beyond linearity.

The first game ever made, "Tennis for Two," was art. You know why? Because anything a person creates is art. It doesn't matter how bad it is or how much someone doesn't like it; someone created something. Video games are art and always have been, and just because someone is arrogant enough to tell you that something you love and enjoy isn't art does not make it any less so. We must develop thicker skins and reject the belittlement of our hobby as inferior to others.

A good film has a good story. No matter how flashy the special effects are, we won't remember it without one. A comic like Cyanide & Happiness may be crudely drawn, but its content comes first: it's memorable, and you never get hung up on the fact that they're stick figures, while you'll never remember a shallow webcomic no matter how painstakingly beautiful it was. These examples should remind you that gameplay should always come first, and it's the reason we can go back and play gems like Super Mario 64, Sly Cooper, and other mechanically driven games even though they don't need a small nuclear reactor to run. I don't believe for a second that many modern games will be remembered as fondly as games that put game design first. How many people talk fondly about Night Trap and still play it? Don't forget that at one time it was state-of-the-art.

With games being so expensive and time-consuming to make, nothing is more important than selling a lot of copies. That means less focus on gameplay and more focus on marketing, including impressing people with tactics like buzzwords. A good game needs good gameplay, and not even reviews seem to understand this, which is highly problematic for anyone looking for advice on how to spend their money.

There are too many games that ride the coattails of amazing visual fidelity. They promote all the wrong things about games because it's easier to get people excited about the lipstick before they realize it's on a pig. We need to be smarter and choose more carefully what we buy, because to publishers it doesn't matter how good or bad the game is as long as they get your hard-earned money. I know this last part is off topic, but it all felt connected, and I didn't know how else to say it without adding unwanted flame-bait to a topic where people are looking forward to these types of games that I feel are focused on flash over substance. Games that get everything right in terms of gameplay and presentation are rare, so I'm skeptical. It's time to be more critical and educated, and to look past cheap tricks. We must start looking at what games do best. They are games. That's nothing to be ashamed of.
 
The game could do that instead of the tv. Problem solved.

Erm. No. You'd still be getting motion judder from showing one frame three times and the next frame two times.

If you didn't do that, you'd have constant tearing, which would be just as bad.

The amount of wrongness in this thread is jaw dropping.
 
Thoughts from a friend of mine:
Ye Gods, I'd be happy to buy you and your friend a stiff drink :)

That was both informative and very well put. Your friend understands the beating heart of the medium, and the challenge at its core for publishers, developers, and players alike in embracing that interactivity.
 
Ye Gods, I'd be happy to buy you and your friend a stiff drink :)

That was both informative and very well put. Your friend understands the beating heart of the medium, and the challenge at its core for publishers, developers, and players alike in embracing that interactivity.

I got to the part where he says they play every frame twice in a movie and my stupid meter filled all up.

This would be the first I've heard of that, and I've worked in film for 15 years. I imagine that post is full of other goodies he picked up around the web.

Thread is classic GAF.
 
Movies can have perfect motion blur. It means that not a single frame is a still frame; every frame is an accumulation of many instants, so you could (almost) say there is much more than 24 frames per second of information in there. A frame contains movement.
That's not possible in a realtime engine unless you render everything with a hefty (and horrible) delay.
That's why games NEED higher fps than movies. Case closed.

The most amazing post on the issue yet.
 
I got to the part where he says they play every frame twice in a movie and my stupid meter filled all up.

This would be the first I've heard of that, and I've worked in film for 15 years. I imagine that post is full of other goodies he picked up around the web.

Thread is classic GAF.

Hmmm. My only exposure to this stuff was an "Intro to Film" class in college, but I could swear I remember them talking about the flicker fusion threshold and how the shutter actually operated at 48Hz, because 24Hz is below the flicker fusion threshold for the majority of people.
 
There are some hilariously uninformed posts in this thread (unsurprisingly, mostly from the 24fps is great! camp).


 
Hmmm. My only exposure to this stuff was an "Intro to Film" class in college, but I could swear I remember them talking about the flicker fusion threshold and how the shutter actually operated at 48Hz, because 24Hz is below the flicker fusion threshold for the majority of people.

First I've heard of this too, and it smells like confusion to me. They're possibly talking about shutter angle rather than frame rate. 24Hz, or 24fps, was chosen because it looked smoother than 16fps, even though research showed that 16fps was enough for images to fuse into motion. Heck, I think they decided 12fps was enough at one point.

I got to the part where he says they play every frame twice in a movie and my stupid meter filled all up.

This would be the first I've heard of that, and I've worked in film for 15 years. I imagine that post is full of other goodies he picked up around the web.

Thread is classic GAF.

Good to see fellow filmmakers on GAF.
 
Believe it or not filmic IS a real word. If it wasn't I'd have been shooting it down beforehand.

And even if he got some of the technical details wrong, I still think he makes a good point: there do seem to be games more concerned with being cinematic experiences than with actually being interesting games. That's kind of why The Order set people off; it's the most naked attempt at that story-driven focus we've seen from a game that's still ostensibly trying to be a more gamey game, like a shooter. So for those of us tired of that approach it's an easy target, whether we full-on attack it or just go "uhh, is this really the best way to go about all of this?"
 
Like, real-life movement feels natural. It just happens. You don't notice a framerate.

24fps movies are like that. When people move, it feels like how my eyes see real people.

60fps is more like "look at me, I'm sooo smooth!" It feels artificial.

Your tag must be some variation of "Sees the world in 24fps". This is just a remarkable post from every angle
 
Your tag must be some variation of "Sees the world in 24fps". This is just a remarkable post from every angle
Looking through his post history more thoroughly... yeah, he's screwing with us, haha.

That, or something broke his mind in the last few months.
 
Thoughts from a friend of mine:

I'm sorry, but this is slightly uninformed. I haven't read the whole thing because it's long, but I got up to this point and I have to say it is just plain wrong:

This is not well known and leads to a lot of confusion: even though film is 24 frames per second, it is flashed 48 times per second in cinemas. Every frame is flashed twice. "Persistence of vision" only holds at around 40 flashes per second or higher; anything lower and the image flickers and the illusion is destroyed. There's a reason 50Hz and 60Hz are used for TV: they are above that threshold. Just because film is 24fps does not mean it is good enough for smooth playback on its own.

Very misleading. Here's the Wikipedia article (http://en.wikipedia.org/wiki/Persistence_of_vision):

Modern theatrical film runs at 24 frames a second. This is the case for both physical film and digital cinema systems.

It is important to distinguish between the frame rate and the flicker rate, which are not necessarily the same. In physical film systems, it is necessary to pull down the film frame, and this pulling-down needs to be obscured by a shutter to avoid the appearance of blurring; therefore, there needs to be at least one flicker per frame in film. To reduce the appearance of flicker, virtually all modern projector shutters are designed to add additional flicker periods, typically doubling the flicker rate to 48 Hz (single-bladed shutters make two rotations per frame – double-bladed shutters make one rotation per frame), which is less visible. (Some three-bladed projector shutters even triple it to 72 Hz.)

There is no 48fps in cinema except for Peter Jackson's uglyfest. It is ONLY necessary to flicker at 48Hz or even 72Hz because of the way film (as in celluloid) works. It is a workaround for a limitation of that technology, but the end result is still exactly the same: 24 frames per second.

In digital cinema projectors there is no need for these extra shutter flicker periods, because you no longer move celluloid from one position to another. It's just a digital image that refreshes... 24 times per second.

So for all intents and purposes, the flicker rate is COMPLETELY irrelevant to this discussion and absolutely misleading.

The best, shortest, and most elegant post in this thread was the one I quoted earlier from Horp, about the idea of a cinema frame being different from a video game frame.
 
Think you should redo this. It doesn't look like you're using motion blur either; I think it would help at a lower frame rate. I locked mine at 24 and it didn't seem that choppy. I even ran around, drove around, shot the place up. It was decent enough, though not really playable for long.

The video even hitches around 11-12 seconds in.

You think gaffers with vendettas give proper accounts?
 