Can we please stop with the whole "60 fps is not cinematic" argument.

It's like old b&w blurry pictures vs 40MP digital shots.

Sometimes they have a charm of their own and you don't need perfect fidelity.

That's a good point. A lot of movies are still shot on film instead of digitally because film has a certain "look" to it that is lost when shot completely digitally.

This is also why musicians will put a mic up to their amp rather than record it directly into the computer: recording the amp gives the sound more warmth, whereas the computer's direct-in loses the airiness and sounds too dry and artificial.

I prefer 30fps games because the slower frame rate can sometimes hide other flaws in the animations and pop-in.
 
I like 30 FPS in most games. Any game with people that runs at 60fps looks bizarre to me when things start moving. I wouldn't call it "soap opera" because I've seen soap operas and they don't look fake and wonky whenever something moves, unlike most games. It's an insult to soap operas to throw them in there with games.

Burnout and other games look great at 60fps so I don't blame the framerate. I think it's just poor motion capture technology or some other form of developer incompetence. Until they can make it not look crappy, I prefer 30.
 
A lot of movies are still shot on film instead of digitally because film has a certain "look" to it that is lost when shot completely digitally.

While it's true that filmmakers choose format (and every other aspect) to serve the film's content, there are many reasons why some people still prefer film, from romanticising it all the way to fear of the new medium (lighting for digital and lighting for film are very, very different). What I do want to respond to properly is the "certain look" claim: without research, it's quite difficult today to tell film from digital in feature films. The extensive postproduction process the image goes through produces very similar results from both, because they all strive for one certain level of quality (while one tries to add grain, for example, the other tries to remove it).

As for framerate, I'm not gonna go again into why the whole comparison between 30fps for games and 24fps for films is so limiting, wrong and also a bit ridiculous, since for games it's like praising lag (I wrote something about it as my first post on GAF). It is, in my opinion, a problem of today's gamer perception. The gaming industry has moved so much towards film that a lot of people want to play movies more than games. People start ignoring, or not paying as much attention to, the core features of gaming and start looking for that "cinematic" feature that nowadays is required in every game. We fight so much over resolution and graphical fidelity that it doesn't matter anymore if the gameplay is just "press X for this choice, press Y for that choice", as long as it's "cinematic".

Someone a few posts up said that 60fps breaks the immersion and is a constant reminder that you're playing a game. I don't have any problem with the reminder that I'm playing a game and not a movie (movies are not reality; they are a medium you are used to and are now trying to assimilate gaming into). It breaks the immersion of it not being a film, and that's it. Immersion and tension are created through atmosphere and well-detailed worlds, not by framerate.
 
When playing a 60fps game (no matter how good the graphics are) it reminds me of playing an arcade game. It's a constant reminder that I'm playing a video game and breaks the immersion. Maybe that's what you're feeling.

I also hate that motion flow crap they put in TVs these days. Makes every show look like it was filmed on a consumer camcorder.

There are 4 kinds of people:

1. Those who won't play anything less than 60fps (PC snobs)
2. Those who love 30fps for its consistency and cinematic look
3. Those who love an unlocked frame rate, even if it means it's only 60fps when looking up at the sky.
4. Those who prefer a stable 60fps, but would rather take a capped 30fps over an unstable 30-60 frame rate.

A game is a game.

When you watch a film, you are not so engrossed in it that you feel you are there or are part of the movie; the same goes for a computer game. Sorry, but "anything more than 30fps breaks immersion" is complete bullshit.

60fps is not for PC snobs.

60fps not only looks better, it just feels better because of how smooth it is. There is literally no argument for stating that 30fps is better in any way.
 
When playing a 60fps game (no matter how good the graphics are) it reminds me of playing an arcade game. It's a constant reminder that I'm playing a video game and breaks the immersion. Maybe that's what you're feeling.
As opposed to thinking you're watching a movie? What's more immersive about that?
 
I think people who really cared about framerate should not settle for less than 120fps.
Well, there are a lot of people, myself included, who so far *have* to content themselves with 60Hz screens, because a 120Hz panel is a luxury they can't necessarily afford.
But at least most of them have the decency not to make ridiculous claims about 60fps being better than 120.
 
There are 4 kinds of people:

1. Those who won't play anything less than 60fps (PC snobs)
2. Those who love 30fps for its consistency and cinematic look
3. Those who love an unlocked frame rate, even if it means it's only 60fps when looking up at the sky.
4. Those who prefer a stable 60fps, but would rather take a capped 30fps over an unstable 30-60 frame rate.

Grow up.

With the much more accurate control from a mouse it is very easy to feel the higher input latency of lower framerates!
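For what it's worth, the latency gap is easy to put rough numbers on. A minimal sketch of the arithmetic, assuming a simple single-buffered pipeline (real engines queue more frames, so actual input-to-photon latency is higher):

```python
# Rough frame-time arithmetic for common target framerates.
# Assumes a single-frame pipeline; real engines add buffering on top.
for fps in (30, 60, 120):
    frame_time_ms = 1000.0 / fps          # how long each frame stays on screen
    worst_case_ms = 2 * frame_time_ms     # input arriving just after a frame begins
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms/frame, "
          f"~{worst_case_ms:5.1f} ms worst-case input-to-display")
```

At 30fps that worst case is roughly 67 ms before anything else in the pipeline is counted, which is why mouse input makes the difference so obvious.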
 
60 fps is great. 30 is playable, but if the game is capable of doing 60, then by all means make it 60. And LOL at the people who say, and are convinced, that your eye can't see more than 30 fps.
 
I never got the argument either. I also never got the argument that 30 fps and 60 fps look the same, because they don't. But at the same time, I never got the idea that a game being 30 fps is such a deal breaker. Frankly, it looks fine to me. Is it as nice? No, but it's far from the end of the world imo.
 
The 60fps Uncharted 3 video they released felt very strange, I don't know why.

60fps to me has always been a positive thing as far as I can remember.

That's because it was 30fps interpolated to 60fps. It wasn't REALLY 60fps.
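For anyone wondering what "interpolated to 60fps" means in practice: the crudest form is just blending neighbouring frames, which is roughly why it reads as ghosting rather than genuine smoothness. A minimal sketch of that naive blending, where `frames` is a hypothetical list of decoded images (real video interpolators do motion estimation instead):

```python
import numpy as np

def blend_to_double_rate(frames):
    """Naively double the frame rate by averaging adjacent frames.

    frames: list of HxWx3 float arrays in [0, 1]. Plain blending like
    this produces ghosted in-between frames; proper interpolation
    estimates per-pixel motion instead.
    """
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(0.5 * (a + b))  # synthetic in-between frame
    out.append(frames[-1])
    return out
```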

The Hobbit in 60fps, or whatever its framerate was, hurt my eyes. It sucked. Battle scenes looked too fast. So personally, 60 fps does not give the cinematic feel for me.

That was 48fps. And if you saw it in 3D, that's probably what bothered you, not the fps. Also, 48 is weird because they didn't want to do 60; they just went "JUST DOUBLE IT, THAT'S GOOD, RIGHT?" No.
 
A framerate has absolutely nothing to do with how movie-like a piece of media is. I can't take a jank-ass iPhone video, make it play at 24 FPS, and go "omg jus like lord of de rings".

Games and movies are two entirely different mediums. It's completely idiotic to believe that applying one single property of one medium to another suddenly makes the second similar to the first in any way.

30 FPS is fine. It's not ideal, but fine. You can play a game to completion, and if it's a locked framerate the thought of the framerate will slip your mind unless it's a genre that truly needs 60 (like fighting or racing). But it isn't "cinematic". That's just a PR buzzword propagated by developers looking to spin a sub-optimal framerate as some flimsy excuse of an "artistic choice" when it isn't one. Don't drink the fucking Kool-Aid. If it doesn't bother you that's fine, but you don't have to partake in these sad mental gymnastics just because they convinced you.
 
Grow up.

With the much more accurate control from a mouse it is very easy to feel the higher input latency of lower framerates!

I will agree that controlling a 30fps game with a mouse is horrendous. However, I prefer using controllers and 30fps on a controller feels fine.

Also, I think that the reason 60fps is hated by some is that 60fps can be associated with arcade games, and I also think 60fps shows TOO MUCH and the game loses some of its atmosphere because of it.

When watching a frantic fight scene in a movie at 60fps, it just looks like behind-the-scenes footage of a couple of actors, kind of like fake wrestling, but when the same scene is played back at 24fps with motion blur, it can often look disorienting (on purpose) and add to the look and feel of the fight. At a slower frame rate it's harder to notice mistakes or the fact that it's fake fighting.

I'm kind of with those people who think 60fps looks too fast and animations look unnatural.

We'll see soon if the 60fps version of the Last of Us improves the game or if people find it odd looking.

Oh, one more thing about 60fps... whenever I see a game running at 60fps, I think to myself that the developers have not pushed the graphics far enough. If the engine has that much overhead, I just wonder why the developers didn't use that extra overhead to provide better physics, lighting, particle effects, animations, etc.

Those are my opinions. I know some people may feel like I'm an idiot or that I'm just conditioned to 30fps or whatnot, but I don't care. I know what I like and I know what looks good to me. Just like a game's color palette can add to the mood or atmosphere, so can the frame rate.
 
I think the whole argument comes from people trying to grasp what next gen exactly is, so the first thing they relate it to is PC gaming, which has massive frame rates and higher resolutions.

People need to disconnect from the whole "Zomg, it's not 1080p and 60 fps" and start looking at games as an experience rather than a pissing match against something that will probably always be more powerful and more versatile than a gaming console: a PC.

SNK got away with using the same hardware for 15-20 years because it delivered the best fighting game experience of all the arcades/consoles (apart from SF2).

Gaming consoles should now simply focus on what they do best, which is quick and easy pick-up-and-go gaming. Next gen is allowing more seamless integration with friends and the social aspect than ever before; the social aspect is one area where console gaming will always be king, and if they improve that experience with new hardware, then that is what next gen should really be about.

I used to spend many hours crowded around my TV with mates trying to beat their high score in a game or locked in a tense battle mode with Super Mario Kart. These things I remember, and it was almost 20 years ago. Whether a game was 30fps or 60fps or 1080p, I don't remember.

Having nice graphics and a buttery frame rate are side dressings imo when it comes to console gaming.
 
When playing a 60fps game (no matter how good the graphics are) it reminds me of playing an arcade game.

Yep!
It's a constant reminder that I'm playing a video game...
Exactly! We love video games!

...and breaks the immersion.
C-C-C-C-Combo Breaker! And here I was thinking you were actually complimenting arcade-quality framerates and gameplay. Different strokes for different folks, I guess.

I also hate that motion flow crap they put in TVs these days. Makes every show look like it was filmed on a consumer camcorder.

I actually hate that motion interpolation crap too, but for different reasons: it looks like an odd mix of 30 and 60 fps with ugly artifacts, not to mention the added input lag.

There are 4 kinds of people:

1. Those who won't play anything less than 60fps (PC snobs)
2. Those who love 30fps for its consistency and cinematic look
3. Those who love an unlocked frame rate, even if it means it's only 60fps when looking up at the sky.
4. Those who prefer a stable 60fps, but would rather take a capped 30fps over an unstable 30-60 frame rate.

Add a fifth: dinosaurs like me who love arcade-quality presentation and gameplay from when arcades were still alive. (Arcade snobs who had to migrate to PC for their arcade fix.)

I'm not allergic to 30fps, but if I can play a 60fps version of the same game I'll go for that one every time I have the chance, or even use it as an excuse to replay a game I've already played (like Strider on 360, then PS4).
 
There are two separate arguments being discussed here. One is whether 30 fps is "good enough" and worth it in order to devote twice as much processing to the rendered image. While I do believe the advantages of doubling the frame rate to at least 60 easily outweigh those of a better-looking image, at least it is a valid choice. One that I believe should be in the hands of players rather than developers, as it is on PC, but that is another argument.

On the other hand, the argument that 30 fps is actually better than 60 fps is ridiculous. If it's simply because people are used to 24 fps movies and somehow associate reality or quality with that experience, then perhaps it needs to be re-evaluated. Then again, the comments in past threads about anisotropic filtering, sharpness, and black crush lead me to question people's perception of what is better. Either that or some people will say anything to participate in platform rivalries.

http://red.cachefly.net/learn/action-examples.zip

This download contains two examples of a dirt bike, one at 24 fps and the other at 60. Taken from this article: http://www.red.com/learn/red-101/high-frame-rate-video
 
I'm not going to argue that 30fps is better than 60fps, but I have no issues with playing a 30fps game. A steady framerate at either level is all I want or need; beyond that I really couldn't give a shit which it is. There are much more important things when considering the quality of a game.

The only issues I have are with highly variable framerates. InFamous 2 and Dark Souls 2 come to mind.
 
It's Stockholm syndrome: people have been stuck with 30fps and below for so long that they're rationalizing reasons to keep it around.

It's not Stockholm syndrome. It's people making excuses for games they like that don't hit 60fps. There's nothing wrong with 30fps, but trying to paint 60fps as a BAD thing in gaming is just flat-out stupid. I really wish it were something GAF made bannable, but it falls juuuuuuuuuuuuust short of being stupid enough to make it on that list.


Go back 4 years and try to find threads on GAF from people claiming lower framerates are better, or that 60FPS is this bad thing that's not 'cinematic'. You won't find anything, and if you do, it's rare. It was born during this generation transition, when certain games fell short of what gamers wanted out of their new consoles (1080p 60fps).
 
It's just good old PR.
"We want to go for a cinematic feeling" sounds better than "we can't get these graphics to run on consoles if we target 60fps".

I don't mind locked 30fps at all.
Most games that are advertised as "cinematic" tend to have a big focus on visual spectacle, good graphics, scripted events and impressive set-pieces.
Personally, I enjoy that type of game (I don't want all games to be like that, but I can enjoy one every now and then) and don't see the issue with going for 30fps while pushing the graphics further than they could at 60fps.


That's what you get when you play on consoles; just accept it. Devs will ALWAYS have to choose between graphics and performance. Some will push performance, many will push graphics, and some others will try to strike a good balance.

I'm sure that at least half the people complaining that "60 frames should be standard on current-gen consoles" would be complaining about "bad graphics" and "a minor graphical update compared to last gen" if devs focused on locked 60fps.
 
Go back 4 years and try to find threads on GAF from people claiming lower framerates are better, or that 60FPS is this bad thing that's not 'cinematic'. You won't find anything, and if you do, it's rare. It was born during this generation transition, when certain games fell short of what gamers wanted out of their new consoles (1080p 60fps).
Well, it wasn't born this generation. The first I heard of it was with Shadow of the Colossus HD; some people agreed with the developer when they said 60fps would look messed up.
 
I will agree that controlling a 30fps game with a mouse is horrendous. However, I prefer using controllers and 30fps on a controller feels fine.

Also, I think that the reason 60fps is hated by some is that 60fps can be associated with arcade games, and I also think 60fps shows TOO MUCH and the game loses some of its atmosphere because of it.

When watching a frantic fight scene in a movie at 60fps, it just looks like behind-the-scenes footage of a couple of actors, kind of like fake wrestling, but when the same scene is played back at 24fps with motion blur, it can often look disorienting (on purpose) and add to the look and feel of the fight. At a slower frame rate it's harder to notice mistakes or the fact that it's fake fighting.

I'm kind of with those people who think 60fps looks too fast and animations look unnatural.

We'll see soon if the 60fps version of the Last of Us improves the game or if people find it odd looking.

Oh, one more thing about 60fps... whenever I see a game running at 60fps, I think to myself that the developers have not pushed the graphics far enough. If the engine has that much overhead, I just wonder why the developers didn't use that extra overhead to provide better physics, lighting, particle effects, animations, etc.

Those are my opinions. I know some people may feel like I'm an idiot or that I'm just conditioned to 30fps or whatnot, but I don't care. I know what I like and I know what looks good to me. Just like a game's color palette can add to the mood or atmosphere, so can the frame rate.

confirmed.

Stop trying to convince everyone that 30fps is better or that anything above it breaks immersion, because you are massively wrong. You bizarrely prefer 30fps, we get it, but claiming it is better in any shape or form is wrong; that's not opinion, but fact.
 
confirmed.

Stop trying to convince everyone that 30fps is better or that anything above it breaks immersion, because you are massively wrong. You bizarrely prefer 30fps, we get it, but claiming it is better in any shape or form is wrong; that's not opinion, but fact.
No, that's not a fact, just your opinion.
There's nothing bizarre about preferring 30fps or stating it breaks the immersion.

Best post I've read about it dates from last week:
This is pointless. The 60fps crowd will never simply accept that someone might genuinely like the look of motion blurred 30fps better in some cases.

" 30fps IS more filmic than 60fps " was NEVER proven wrong. It can't be proven wrong.

While you can't obtain motion blur of the same quality as that captured on film, 30fps+mb will always look a lot closer to what [our brain is conditioned to think] a movie looks like than 60fps does.

There's no way around this.

Whether this is because our brain is conditioned to automatically classify 24fps as film and 60fps as cheap camcorder footage, that's another story, but the point still stands.

I am a videomaker; I do video and motion graphics for a living.
If I shoot 48 fps footage with my camera, most people have the impression it doesn't "look like a movie" for some reason (most won't be able to tell why).
Simply halving the framerate will do the trick: people will think that the same footage now "looks like film".

And when I do motion graphics in After Effects or some 3D rendering software, I can export to whichever framerate I like. I sometimes choose 25 fps with motion blur, because in a lot of cases I just prefer that look. Plain and simple.
It is, in other words, my artistic choice, and as someone who does motion graphics and often goes for 25fps, I completely understand why 30fps might honestly be a purely aesthetic choice.

Of course there are benefits coming from that and from the 800p resolution, but that's beside the point.

There are other factors working in favour of 30fps.

60fps will in fact look closer to real life.
Because of that, and because of the increased fidelity, your eyes are more likely to go searching for inaccuracies and inconsistencies.
That's why, watching The Hobbit, I'd often snap out of the illusion and had to try really hard to convince myself I wasn't looking at a bunch of weird guys wearing funny costumes and make-up, moving back and forth on a movie set.
60fps improves depth perception; everything appears sharper. The shape of things and objects onscreen becomes easier to read, and 3D models acquire a much more 'solid' presence. Which is good in many ways, but it also makes it much easier to pick out inaccuracies, polygonal edges and flat surfaces.

To some, these might all sound like positives in a game; I'm not arguing that.
Just accept that not everyone looks for the same things.
Sometimes 30fps helps the illusion; it's a screen that helps hide the fine details that immediately tell you what you're looking at is fake, and it sometimes works better for games that go for a movie look or photorealism.
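The motion blur the videomaker above talks about can be approximated in a renderer by accumulating several sub-frame samples per output frame. A minimal sketch of that accumulation idea, assuming a caller-supplied `render_at(time)` function; this is not how After Effects or any particular engine actually implements it:

```python
import numpy as np

def render_frame_with_blur(render_at, t, fps=25, shutter=0.5, samples=8):
    """Average sub-frame samples across the shutter interval.

    render_at(time) -> HxWx3 float array. shutter=0.5 mimics a
    180-degree shutter: the exposure covers half the frame interval,
    so fast motion smears naturally instead of strobing.
    """
    frame_len = 1.0 / fps
    times = t + np.linspace(0.0, shutter * frame_len, samples)
    return np.mean([render_at(ti) for ti in times], axis=0)
```

The point of the sketch is just that the "film look" comes from integrating motion over the exposure, not from the low frame count by itself.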
 
Sorry, but this "breaks immersion" argument is all crap. It is a game; no game makes you feel like you are there. The same goes for a movie, which is a completely different medium and should not even be compared to a game in the first place. Also, playing at a higher frame rate that makes the game run smoother isn't breaking anything.

60fps does not break immersion as there is no immersion to break in the first place.
 
Just because games are interactive doesn't mean there can't be immersion. When you play a game, you are engaged with something.

Eg.
Playing a horror game in the dark intensifies the immersion.
Buying a driving wheel increases immersion in a racing game.
High resolution can unintentionally make a realistic game look cartoony because it exposes low-res textures.
And I believe high framerate can negatively impact the immersion for some people as well.
 
Being unable to see what's going on makes for more immersion? Okay.
Well, the idea is that games are illusions and that breaking that illusion can affect the immersion.

When you play a game like Dark Souls and crank up the brightness for gameplay advantages, you see more, but you break the immersion.
I'm playing Bioshock Infinite (PS3) at the moment, and everything seems to be deliberately covered in vaseline light effects just to create immersion.

Just like a higher framerate, a higher resolution is considered to be absolutely superior as well, yet it can make the 2D trees in vanilla Oblivion and the pixelated bitmap skyboxes in Halo PC visible.

Of course, there will be instances where a higher framerate or resolution does increase immersion. So it's an argument that goes both ways.
 
M°°nblade said:
Well, the idea is that games are illusions and that breaking that illusion can affect the immersion.

When you play a game like Dark Souls and crank up the brightness for gameplay advantages, you see more, but you break the immersion.
I'm playing Bioshock Infinite (PS3) at the moment, and everything seems to be deliberately covered in vaseline light effects just to create immersion.

Just like a higher framerate, a higher resolution is considered to be absolutely superior as well, yet it can make the 2D trees in vanilla Oblivion and the pixelated bitmap skyboxes in Halo PC visible.

Of course, there will be instances where a higher framerate or resolution does increase immersion. So it's an argument that goes both ways.

I'm not quite sure you know what immersion means. You can immerse yourself in anything. Playing at a higher frame rate for a smoother, more visually appealing game does not break anything.

Games are not illusions; why the hell are people convincing themselves of this?
 
The idea that 60 fps breaks immersion doesn't make any sense to me.

It's important to note that 30 fps games will never feel like 30fps film, even if you had the correct motion blur. The interaction completely changes the visual perception: we look at and react to visual information in a different way when we are interacting with it than when we are just watching.

Even so though, if games had the amount of motion blur that film has it would be an unplayable blurry mess.
It doesn't make any sense to make a game look like film in terms of frame rate.
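For scale: film's conventional 180-degree shutter means each frame integrates half the frame interval of motion, so a 24fps movie bakes roughly 21 ms of smear into every frame. A quick sketch of the arithmetic, assuming the standard 180-degree convention:

```python
def shutter_exposure_ms(fps, shutter_angle=180.0):
    """Per-frame exposure time for a given shutter angle (film convention)."""
    return (shutter_angle / 360.0) * (1000.0 / fps)

print(shutter_exposure_ms(24))  # ~20.8 ms of motion baked into each film frame
print(shutter_exposure_ms(60))  # ~8.3 ms at 60 fps with the same shutter angle
```

Smearing a controllable image over 20 ms every frame is exactly the "unplayable blurry mess" the post above describes.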
 
The cinematic argument loses all merit when games do not simulate your perspective as if it were a camera. Games lack every aspect that makes movies at 24/30fps feel more cinematic: first and foremost proper motion blur, but also things like lighting and lenses. Games do not simulate a camera, and to extract the fps as the single defining thing that gives movies a cinematic look is some proper denial. If you remove half the frames of a soap opera, it is still going to look like a soap opera due to production quality.

As an example, the switch between 60 fps gameplay and 30 fps movies with almost the same graphical fidelity in Wolfenstein underlines this point even more for me. The only thing I notice when I'm watching the pre-rendered cinematics is how choppy and inelegant they look compared to how fantastically everything else in the game flows. And the cinematic-style content done in real time at 60fps looks fantastic in comparison.
 
Action in low fps film has always been an awful, blurry mess. More frames per second can only help correct that.

Yes, it has NATURAL motion blur, which gives that cinematic effect. Games at 30fps have no such thing. Games are not in the real world. There is nothing cinematic about lower fps in games. This whole thread just has me shaking my head in disbelief at some people's posts.
 
I can't believe this is even an argument. 30fps is a compromise, nothing else. 60fps should be the absolute standard, and it's exactly why I buy all my games for PC. If 30fps were really a choice, then wouldn't it be locked more often on PC as well? AFAIK Need for Speed: Most Wanted was the only game that did this, and it was a mess on PC.
 
The cinematic argument loses all merit when games do not simulate your perspective as if it were a camera. Games lack every aspect that makes movies at 24/30fps feel more cinematic: first and foremost proper motion blur, but also things like lighting and lenses. Games do not simulate a camera, and to extract the fps as the single defining thing that gives movies a cinematic look is some proper denial. If you remove half the frames of a soap opera, it is still going to look like a soap opera due to production quality.

Because games have no concept of exposure time!

These statements aren't correct. Games since a few years back (HL2, for example) do simulate exposure: it's necessary to create believable lighting and a consistent workflow for asset production (textures).

Games recently also increasingly simulate physical lenses to get effects such as chromatic aberration, bokeh patterns and so on. Of course, the one thing they can't fully simulate is the continuous stream of light present in physical reality, hence the lack of motion blur.
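As a concrete illustration of the exposure simulation mentioned above, engines commonly adapt exposure from the frame's average luminance. A generic eye-adaptation sketch, not Half-Life 2's or any specific engine's actual implementation:

```python
import numpy as np

def adapt_exposure(prev_exposure, frame_rgb, key=0.18, speed=0.05):
    """Very simplified auto-exposure / eye adaptation.

    frame_rgb: HxWx3 array of linear-light values. Exposure drifts
    toward a value that maps the frame's log-average luminance to a
    mid-grey "key", mimicking how engines brighten dark scenes.
    """
    lum = (frame_rgb * np.array([0.2126, 0.7152, 0.0722])).sum(axis=-1)
    log_avg = np.exp(np.mean(np.log(lum + 1e-4)))
    target = key / max(log_avg, 1e-4)
    return prev_exposure + speed * (target - prev_exposure)
```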

I personally find this camera-lens simulation to be a bit unsettling in terms of what it logically implies. Is the entity I'm controlling supposed to be wearing a traditional camera with a lens? How does it make sense to filter the virtual world through the concept of a physical camera lens when there are other modes of perception to be explored? Even if it is only in there to look "cool" as an effect, it still communicates to me that there's a regular virtual camera in the game world, and in most cases that doesn't make any sense.

There is a disconnect between the desire to model a photo-realistic virtual world suited to composition with real photographs and the modeling of realistic virtual worlds, and it is rarely brought up. The camera is what we use to capture physical reality in images, but it doesn't have to be that way with games.
I can't help but see it as a symptom of an inferiority complex toward the movie industry. It means movies are treated as the premiere storytelling medium, with developers trying to emulate that film-like quality instead of exploring the possibilities of the medium their product actually belongs to. Ultimately this has a negative impact when interacting with the game.

Not to mention other inconsistencies, like how the terms filmic/cinematic invoke associations with styles of camera work, lighting and other things that, if implemented, would be detrimental to how games function. It just doesn't make any sense to compare films and games in terms of how the actual image looks on the screen in motion, when the medium itself alters our perception.
 
Stability in a game's framerate is more important than anything else, really. 30 fps doesn't bother me if it's steady. But if you can do 60 fps, you should.
 
The biggest problem with motion blur in film is that objects are blurred the same whether you are focusing on them or not. This is not the case in reality, and is why I said that low-fps film is a blurry mess. I track an object on screen and it has more blur than it should... so damn annoying.
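Put crudely: the streak length baked into a filmed frame depends on motion relative to the camera, while your eye expects whatever it is tracking to stay sharp. A back-of-the-envelope sketch of that difference (the screen-space velocity is just an illustrative made-up number):

```python
def blur_length_px(object_vel_px_s, reference_vel_px_s, exposure_s=1.0 / 48):
    """Approximate streak length over one exposure, relative to a reference.

    A static film camera's reference velocity is 0, so anything moving
    gets smeared; a tracking eye's reference is the object itself, so
    that object should look sharp.
    """
    return abs(object_vel_px_s - reference_vel_px_s) * exposure_s

bike_speed = 1200.0  # px/s across the screen (illustrative)
print(blur_length_px(bike_speed, 0.0))         # smear the camera bakes in: 25 px
print(blur_length_px(bike_speed, bike_speed))  # what a tracking eye expects: 0 px
```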
 
The question is what framerate would Metroid Prime and F-Zero be running at if you bumped the resolution up to 1080p native?

The Gamecube natively displayed games at 480p standard def at 4:3 ratio.
 
The question is what framerate would Metroid Prime and F-Zero be running at if you bumped the resolution up to 1080p native?

The Gamecube natively displayed games at 480p standard def at 4:3 ratio.
Yes, technology more than a decade old isn't up to modern standards.
 
I think the real discussion should be about what exactly developers feel is "cinematic" or "filmic", as opposed to what movie directors feel it is.

When you have people like Amy Hennig say she wanted Uncharted to play out like a 1980s Indiana Jones movie, is that type of cinematic/filmic idea really based on frame rate, or is it based on the look and feel of the game?

I do feel Ready at Dawn's response is a cop-out on the frame rate issue. It's obvious that they felt they could cram more graphical fidelity into 30 fps, which would give them more detail, and I guess they relate that to "filmic". But given the recent leaks, and press stating that quite frankly The Order is running very badly right now, I hope that by the time it releases we will see a locked 30 frame rate on it.
 
Here is the thing I don't understand: I love horror games and enjoy almost every one of them, but the one game I couldn't enjoy was Eternal Darkness. At first I didn't understand why, but later I found out it ran at 60 frames, so I thought that was it and that horror games should always be 30 frames. But then I read that Silent Hill 3 was also 60 frames, so now I'm confused, because I love that game.
 