In defense of the "filmic" look.

TVS. AND. MONITORS. THE DIFFERENCE IS IN THE DISPLAY NOT THE SYSTEM.

I posted on that earlier, and I've actively tried hooking a computer up to a TV before to see how it'd look. That's where the difference actually is. Consoles are just specialized computers anyway.

That weirdly doesn't actually look that bad, but half of that may be higher settings than consoles (or squishing the image down) and the fact that I'm watching, not playing. It probably wouldn't feel that great actually playing at that framerate, and it's a huge, huge mistake for anyone to actually target 24 over 30, especially without motion blur.

I don't know how many times I can say this... the argument I've presented is not about how a game plays, only about the idea of the "film look" or "feel". 24FPS would not look bad at all if it were built properly. I'm not saying they should strive for it; all I'm saying is that the look and feel would change.
 
As the person playing, I found it absolutely playable. Not as great as 60fps, of course, but I was able to play. I was even able to parry an enemy and dodge through the turtle enemy's swings.

It doesn't feel terrible, and doesn't look anywhere near as bad as the previously posted examples.
Yet that extra 6 fps makes a huge difference, being a 25% increase, and 60 is a multiple of 30 rather than 24, so the end result will come out better (then there's 120hz, but man, why get a computer monitor at that refresh rate only to play games at 24 fps on it?). Yeah, it's not the worst thing or anything, but on current hardware it seems kind of stupid to go for it, and it'd probably hit doubly hard in racing or shooting; at least Dark Souls isn't really that fast-paced of a game.
 
(A quick review of our conversation):

Now to be fair, that appears to be a framerate generated without the normal optical blurring that characterizes what people mean when they're talking about a filmic look.

Guess what, video games aren't going to employ those film techniques either, so that's a moot point.

Motion blur is a standard visual effect in videogames. Done correctly, it drastically improves the apparent fluidity of motion.

Yes, and it's done for games currently at 30/60.

You know what else will help fluidity? Higher framerates for your games.
 
I assume you mean lighting? And I don't know anything about it really, but am pretty sure that isn't the case. If it were, something at 24fps with the "wrong lighting" would have the soap opera effect as well, no?

Lol, yeah, fucking phone. But no, it is indeed the lighting. Soap operas have the lowest production quality; everything about them is cheaply made.
 
Yet that extra 6 fps makes a huge difference, being a 25% increase, and 60 is a multiple of 30 rather than 24, so the end result will come out better (then there's 120hz, but man, why get a computer monitor at that refresh rate only to play games at 24 fps on it?). Yeah, it's not the worst thing or anything, but on current hardware it seems kind of stupid to go for it, and it'd probably hit doubly hard in racing or shooting; at least Dark Souls isn't really that fast-paced of a game.

It might sound crazy, but I do have a 120hz monitor, and 24fps content like movies and TV shows (as well as games at 24fps!) looks a lot better on the monitor, because 120 is a multiple of 24, and every frame is shown for exactly 5 refreshes, rather than being shown an inconsistent number of times.
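To put rough numbers on that (just an illustrative sketch, nothing from an actual game or display driver; repeat_pattern is a made-up helper), here is how a fixed-refresh panel has to repeat source frames when it does plain pulldown with no interpolation. 24fps on 60hz forces the uneven 3:2 cadence, 30fps divides evenly, and 24fps on 120hz gets a clean 5-refreshes-per-frame pattern, which is exactly the effect described above.

```python
def repeat_pattern(content_fps, display_hz, frames=8):
    """How many consecutive refreshes each source frame occupies when
    content at content_fps is shown on a display_hz panel using plain
    pulldown (no interpolation). Illustrative sketch only."""
    shown = [0] * content_fps
    for refresh in range(display_hz):
        # each refresh displays whichever source frame is current at that instant
        shown[refresh * content_fps // display_hz] += 1
    return shown[:frames]

print(repeat_pattern(24, 60))   # [3, 2, 3, 2, 3, 2, 3, 2] -> uneven, judders
print(repeat_pattern(30, 60))   # [2, 2, 2, 2, 2, 2, 2, 2] -> even
print(repeat_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even, as described above
```

That's also the whole "60 is a multiple of 30 but not of 24" point from earlier in one place.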
 
Lol, yeah, fucking phone. But no, it is indeed the lighting. Soap operas have the lowest production quality; everything about them is cheaply made.

It's not the lighting; the majority of the problem comes from the 60fps. The lighting is bad, but it would look more "film-like" at 24FPS.
 
The second Hobbit film, where they adjusted the color grading, already looked far more "filmic" than the first. HFR films only have room to improve, and it's a much bigger improvement to cinema than 3D was.
 
60FPS has always been the gaming standard, and instead of comparing video games to films we should compare them to other games, like televised sports. In TV land, sports are captured at 60FPS; the major networks even chose 720p over 1080i not because of cost but because it handles fast motion better. Even games that feel more like story experiences, such as The Walking Dead, benefit from 60FPS.
 
It's not the lighting; the majority of the problem comes from the 60fps. The lighting is bad, but it would look more "film-like" at 24FPS.

You seriously don't realize that soap operas look considerably worse than anything else on TV that's also shot at 60 fps?
 
How many fps does real life run at?

My brain is still running an old intel chip and I stupidly bought one of those 'all in one brain' meaning it's not easy to upgrade :(

Also, video games require input. Movies do not. This point has been made a thousand times before. I don't mind 30fps, but in some genres I just can't stand it.

I quite enjoyed TLOU simply because its gunplay felt so "weighty".
 
You seriously don't realize that soap operas look considerably worse than anything else on TV that's also shot at 60 fps?

Did I say that? The "soap opera" effect refers to motion interpolation or motion flow on tvs, which gives movies and tv that 60FPS look. You seriously didn't know what the soap opera effect refers to?
 
I don't know how many times I can say this... the argument I've presented is not about how a game plays, only about the idea of the "film look" or "feel". 24FPS would not look bad at all if it were built properly. I'm not saying they should strive for it; all I'm saying is that the look and feel would change.
I'd been having trouble thinking of a response to this, but perhaps it's because there really isn't much to say. Yeah, you can give it a distinct feel and still make it look good, which is why movies stuck with it even after CGI and the move from film to digital, when the old hangups no longer applied. There really isn't much more to say there, until you get into the fact that games ARE ultimately an interactive experience, and thus there's the question of whether it can ever be acceptable, and when. Most likely that's only for games that are actively trying to be films, and not like The Order but like a David Cage game, or LA Noire while investigating rather than shooting. For shooting you absolutely need 30+; it feels terrible trying to aim below that.
It might sound crazy, but I do have a 120hz monitor, and 24fps content like movies and TV shows (as well as games at 24fps!) looks a lot better on the monitor, because 120 is a multiple of 24, and every frame is shown for exactly 5 refreshes, rather than being shown an inconsistent number of times.
Ah crap you are crazy!

But yeah, I kind of expect that for TV/movies that monitor really could be better, at least in regard to motion. Maybe not necessarily color or IQ, but being an exact multiple rather than this pulldown stuff must make for a subtle yet significant improvement. For games I'd only do it out of morbid curiosity, at 4k downsampled with full AA on or something, and it's not something I'd stick with over 120.
 
You seriously don't realize that soap operas look considerably worse than anything else on TV that's also shot at 60 fps?

Right, but I would see this "cheap" effect in any movie/tv show running at 60fps. If I watched Indiana Jones (or any other movie) at 60fps it would look like it was shot by an amateur with a shitty home camera to me. If you've ever seen The Blair Witch Project or anything where they "film" something within the movie you'll understand what I mean. The movie itself looks like a quality movie, but while we're looking through the cameras they're filming with in the movie, you get the "cheap" look.

Again though, I don't see this cheapness in a 60fps video game.
 
Now this is infuriating when somebody does this. How about you not assume I didn't read the thread when I did?

Then your reading comprehension skills are struggling. Try to grasp what the core argument is in this thread: it's not about what is better for gaming, it's more about defending the idea of a different feel or look achieved by a lower fps. That effect does in fact exist.
 
I'd been having trouble thinking of a response to this, but perhaps it's because there really isn't much to say. Yeah, you can give it a distinct feel and still make it look good, which is why movies stuck with it even after CGI and the move from film to digital, when the old hangups no longer applied. There really isn't much more to say there, until you get into the fact that games ARE ultimately an interactive experience, and thus there's the question of whether it can ever be acceptable, and when. Most likely that's only for games that are actively trying to be films, and not like The Order but like a David Cage game, or LA Noire while investigating rather than shooting. For shooting you absolutely need 30+; it feels terrible trying to aim below that.

Ah crap you are crazy!

But yeah, I kind of expect that for TV/movies that monitor really could be better, at least in regard to motion. Maybe not necessarily color or IQ, but being an exact multiple rather than this pulldown stuff must make for a subtle yet significant improvement. For games I'd only do it out of morbid curiosity, at 4k downsampled with full AA on or something, and it's not something I'd stick with over 120.

Well there you go, this is my point exactly, right on the damn head. The idea that games are unplayable at 30FPS is a total fallacy anyways; some of the best games of the past generation were 30FPS, while also achieving better visuals and, in some people's minds, a better form of storytelling.
 
My brain is still running an old intel chip and I stupidly bought one of those 'all in one brain' meaning it's not easy to upgrade :(
I still make those 28.8 baud modem connection noises whenever I think too hard about something.

Even still my brain is running 120fps minimum, because I turned all the settings down. You're all 2D sprites to me, communicating in text.
 
Films aren't 24fps. They may be displayed at 24 fps, but they are not recorded at 24fps, not in the way that video games measure fps. In a video game, a frame is an instant snapshot of the current state. That is not how a camera works. You cannot take an instant snapshot of the world. One frame from a camera consists of an "infinite" number of frames all composed together into one; the film continually receives information for the entire duration that the shutter is open.

To make a video game look like film, you would need to render it at "infinite" fps but then display it at 24 Hz. The "motion blur" solution of transposing one frame over the previous one is not camera motion blur. You would need to composite an "infinite" number of frames into one frame for it to be camera motion blur.
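A minimal sketch of that accumulation idea, assuming a hypothetical render_subframe(t) callback that returns an image for a single instant in time (nothing here comes from a real engine): average many sub-frames spread across the shutter interval, which approximates the continuous exposure a camera integrates.

```python
import numpy as np

def camera_style_blur(render_subframe, t0, shutter, n_subframes=32):
    """Approximate camera motion blur by averaging n_subframes rendered
    across the shutter interval [t0, t0 + shutter]; a finite stand-in
    for the "infinite" exposure described above."""
    acc = None
    for i in range(n_subframes):
        t = t0 + shutter * (i + 0.5) / n_subframes  # sample times inside the shutter window
        frame = np.asarray(render_subframe(t), dtype=np.float64)
        acc = frame if acc is None else acc + frame
    return acc / n_subframes

# e.g. a 24fps frame with a 180-degree shutter exposes for 1/48 s:
# blurred = camera_style_blur(render_subframe, t0=0.0, shutter=1 / 48)
```

The more sub-frames you accumulate, the closer it gets to camera blur, which is also why simply smearing the previous displayed frame over the current one doesn't look the same.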


And regarding what fps reality runs at: it depends on whether you mean the world or your perception of the world. Your perception of the world runs at a different "fps" depending on which part of your field of vision you're talking about.

If we define fps as the maximum potential number of changes of state that can occur in one second, the fps of the world is speed of light / Planck length = 1 / Planck time ≈ 1.855 * 10^43 Hz. That is, assuming we're in a vacuum (which we're not).
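If anyone wants to check that figure, here's the arithmetic (a quick sanity check only, using rounded values for the constants):

```python
c   = 2.998e8     # speed of light, m/s
l_p = 1.616e-35   # Planck length, m
t_p = l_p / c     # Planck time, s (about 5.39e-44 s)
print(f"1 / Planck time = {1 / t_p:.3e} Hz")  # ~1.855e+43 Hz
```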
 
Did I say that? The "soap opera" effect refers to motion interpolation or motion flow on tvs, which gives movies and tv that 60FPS look. You seriously didn't know what the soap opera effect refers to?

Yeah, basically the term "Soap Opera effect" comes from the days when soap operas were shot on 60fps interlaced videotape to reduce the cost of filming each episode, which gave everything that floaty motion-blur look when played back through TV broadcasts.
 
As always, I'll just note that whenever I notice that a game is running in 60fps it looks weird and 'unreal' to me for quite some time before I get used to it. Sometimes it makes me nauseous. It was the biggest thing I had to get used to when I started playing PS4 - so many games at 60fps or close to it.

I have no problems with 30fps at all. I'm not sure that 'smoother' is better visually. It's definitely better for twitch inputs and responsiveness, but visually? I'm not sure. I know that's a pretty unpopular view around here.

Give me 1080p and AA over framerate any day.
 
Yeah, basically the term "Soap Opera effect" comes from the days when soap operas were shot on 60fps interlaced videotape to reduce the cost of filming each episode, which gave everything that floaty motion-blur look when played back through TV broadcasts.

Funny he made such a snide "you seriously don't get it" remark when he seems to be out of the loop.
 
60 FPS only looks "strange" in film because that isn't the standard. That's it. There's nothing in 24 FPS (or 30 for that matter) that looks any more "realistic" or "grounded".

60 FPS also looks significantly closer to real life, because it's much closer to how people perceive the real world than 24/30 FPS is. If you're going for "realism", 60 FPS is actually more realistic.



Yeah...that's just you. There's nothing artificial about 60 FPS.

I really hate this line of thinking. Home video cameras have been shooting at 60i for DECADES, same with broadcast cameras. It looks cheap, and not because of the resolution, or quality of the video.

Film guys on a budget have been doing clever things to get cameras shooting at 24p with a shallow depth of field in order to produce a very specific look for years, up until DSLRs could do that cheaply.

Now I'm not saying that 'film look' is necessarily better for video games (man, the first time playing Crazy Taxi, delicious), but to argue that the only reason people prefer it for non interactive video is because it's the standard is ridiculous.
 
... what?

The only time you see this "fast forward" effect is with LCDs using motion flow technology to repeat or interpolate extra frames to reduce judder. It is not the same as content running at a higher fps.
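For what it's worth, here's a crude sketch of what "interpolating extra frames" means (naive blending only; real motion-flow processing is motion-compensated, and naive_interpolate is a made-up name): the in-between frames are synthesized from their neighbours rather than captured or rendered, which is why it isn't the same as content that actually runs at a higher fps.

```python
import numpy as np

def naive_interpolate(frame_a, frame_b, steps=1):
    """Synthesize `steps` in-between frames by blending two neighbouring
    frames. A crude stand-in for TV motion interpolation; the point is
    that the extra frames are made up, not rendered or captured."""
    a = np.asarray(frame_a, dtype=np.float64)
    b = np.asarray(frame_b, dtype=np.float64)
    made_up = []
    for i in range(1, steps + 1):
        w = i / (steps + 1)
        made_up.append((1 - w) * a + w * b)  # linear crossfade between neighbours
    return made_up
```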

Movies have motion blur to remove judder. You don't need that in games, as you can just render the frames faster and not use any motion blur.

I can't stand that Hobbit 48fps footage. But games at 60fps I have no issue with; they're much better than anything less.
LCDs that display using 300hz or more give me the fast-forward effect; it's not the same with games being rendered.

I can't even believe this is being debated.
 
I really hate this line of thinking. Home video cameras have been shooting at 60i for DECADES, same with broadcast cameras. It looks cheap, and not because of the resolution, or quality of the video.

You're going to need to back up your assertion that obviously cheap elements are not the reason something looks cheap.
 
With the type of TV I have, I'll take the framerate, thanks. Nothing wrong with 30 FPS, as it can work for some games, but there's no arguing that a higher framerate can help gameplay in others.
 
Did I say that? The "soap opera" effect refers to motion interpolation or motion flow on tvs, which gives movies and tv that 60FPS look. You seriously didn't know what the soap opera effect refers to?

He was referring to soap operas. If you want to talk about the effect, fine.
You are indeed correct that the effect comes from motion interpolation. But this is ironically a problem for 24 fps material and not 60 fps material, so I'm not really sure why you're bringing it up; it only weakens your argument.
 
The idea that games are unplayable at 30FPS is a total fallacy anyways.

some of the best games of the past generation were 30FPS, while also achieving better visuals and, in some people's minds, a better form of storytelling.

"When people have one opinion they're just wrong."

"When I have an opinion it's different."
 
When people have one opinion they're just wrong.

When I have an opinion it's different.

When an opinion is also an outward assertion, it can be wrong. Saying "I dislike X" is an opinion. No room for a counterpoint there. Saying "X is unplayable" is a different matter.
 
Well there you go, this is my point exactly, right on the damn head. The idea that games are unplayable at 30FPS is a total fallacy anyways; some of the best games of the past generation were 30FPS, while also achieving better visuals and, in some people's minds, a better form of storytelling.
This is actually part of why I linked that GameSpot video, which you should watch whenever it's convenient: there is a point where your brain is no longer tricked into seeing motion and instead perceives it more as a literal slide show, and that's at around 17 fps. But even below that, games can be playable depending on the kind of gameplay they're going for. I think Ultima VII was something like 5 fps, and that came out on PC during the 16-bit years, when many people were playing SNES or Genesis games that ran at 60 fps, at least for scrolling.
"When people have one opinion they're just wrong."

"When I have an opinion it's different."
While there are opinions involved, you can go way below 30 fps before a game objectively becomes unplayable, and that depends on the genre. Something like chess would be playable at 1 fps, after all, even if there's no damn way any modern action game would be playable at that framerate.
 
I don't know how many times I can say this... the argument I've presented is not about how a game plays, only about the idea of the "film look" or "feel". 24FPS would not look bad at all if it were built properly. I'm not saying they should strive for it; all I'm saying is that the look and feel would change.

The look and feel are terrible. You like it solely because you are used to it. It is objectively worse.

You like it only out of ignorance and inexperience.

That's fine and all, but as a whole, you're helping make video games worse for everyone.

Go buy a 120hz monitor and give your eyeballs some candy. You will come back on the other side of this.
 
Depends. With passive film, yes 60 FPS looks weird. I even think cut scenes in games look strange at 60FPS. But when you have gameplay involved, 60 FPS is IMO absolutely superior. It doesn't look that strange, because you aren't passively watching, you are actually controlling the actions of the character, and reacting to things.

So that's why I don't really buy the "cinematic" argument at 30 FPS when it comes to actual gameplay. As a hardcore PC gamer for the last 3 years, I've played most games at 60FPS and above. It's always been a massive improvement over consoles. Honestly, if it just became the standard, I think most would come to accept it. It might be jarring for some at first, but it's an overall improvement. Not to sound like a broken record, but gameplay..gameplay..gameplay. It's the factor that kind of negates the 30 FPS cinematic argument.
 
You're going to need to back up your assertion that obviously cheap elements are not the reason something looks cheap.

What is cheap about broadcast quality cameras?

And if resolution is the reason 60i looks cheap, then a 480i CRT, or an ad being shown in a small online video window, would ruin that 'film look' (3:2 pulldown aside).
 
Film has motion blur and no rendering resolution for objects. That's why games running at 24fps look noticeably worse than 24fps film.
 
It's a lot about perspective. 60fps to me is absolutely sub-optimal and I try to maintain 90-96. If all games on consoles ran at 120fps and everyone had 120hz displays most of the forum would feel the same way about 60fps as they do now about 24-30.
 
I really wish I had been able to catch a viewing of the hobbit at double the fps over standard. I feel like this is one area where movies could definitely step it up, but haven't done so because people think it looks "weird". Hell maybe it does. I haven't seen it for myself yet, so it's hard to judge either way.

For games I prefer 60 over 30 at all times. All things being equal, there is no situation I can think of where I'd actually want 30 fps.
 
What is cheap about broadcast quality cameras?

And if resolution is the reason 60i looks cheap, then a 480i CRT, or an ad being shown in a small online video window, would ruin that 'film look' (3:2 pulldown aside).

Erm, we talked about soap operas. They are cheap. Filmed on tape, cheap lighting. And well, they look cheap. Now watch a news show. A football broadcast. Doesn't look cheap? Guess why.
 
No, it does not!

If you plug a controller (with its much less accurate motion vs a mouse) into a PC, a game running at 30FPS will feel the same.
You forgot the TV. No seriously don't forget the TV.

I haven't followed TV LCD response times lately, but it probably has to do with the fact that last I checked TVs had slower LCD response time than monitors did, so the shift in colors can sort of create a natural motion blur, and thus "look better." Just like with film versus a game running at 24 or even 30 fps without any (good) motion blurring applied.
 
You forgot the TV. No seriously don't forget the TV.

I haven't followed TV LCD response times lately, but it probably has to do with the fact that last I checked TVs had slower LCD response time than monitors did, so the shift in colors can sort of create a natural motion blur, and thus "look better." Just like with film versus a game running at 24 or even 30 fps without any (good) motion blurring applied.

That depends on the display (real motion blur is a lot stronger than what you get on any modern LCD), and even then, LCD blurring does not look very nice.
 
There is a documentary, "Side by Side", featuring Keanu Reeves, that asks a lot of filmmakers about filmmaking and its history.

The reason for the low framerate in movies we are used to is nothing more than that the material (celluloid) was very expensive, so the less of it was used the better for the production; every borked scene was real money lost.

Everything else, the electronics we have in our homes, how movies are converted for them, and the psychological effects of that framerate, are just results stemming from that decision. People are used to it because it was the most cost-effective way to produce movies. As technology progresses, so will our behaviour, sooner or later.
 
That depends on the display (real motion blur is a lot stronger than what you get on any modern LCD), and even then, LCD blurring does not look very nice.
It probably only actually works when it's subtle and the game isn't THAT dependent on low LCD response times. Yeah, generally it's better to have lower response times, but I imagine any decent quality TV from within the last decade will do a pretty good job there, unlike monitors from around the late 90s/early 2000s.

I do wonder if there might be some TV processing to account for too. At any rate, I'm sure that whatever makes a game "look better at 30 FPS" on console is going to be tied to the TV, and a PC hooked up to that TV WILL look "better" in the same way a console does. And hopefully that comparison is against a PS4, or at least an XB1, not a PS3 or 360.
 
Oh my god, can someone stop this msdstc guy? He's literally just making shit up and then telling everyone else they're wrong.
 
It might sound crazy, but I do have a 120hz monitor, and 24fps content like movies and TV shows (as well as games at 24fps!) looks a lot better on the monitor, because 120 is a multiple of 24, and every frame is shown for exactly 5 refreshes, rather than being shown an inconsistent number of times.

I never thought about that... I want a 120hz monitor (even more) now.
 