EightBitNate
Member
I can barely get my life past 15FPS. Should I try upgrading my video card or downloading more RAM for myself?
It has only become passable because of motion blur. You can't have a filmic look without motion blur.
Movies can have perfect motion blur. It means that not a single frame is a still frame. Every frame is a delta of many frames. So you could (almost) say that there is far more than 24 frames' worth of information per second in there. A frame contains movement.
That's not possible in a real-time engine unless you render everything with a hefty (and horrible) delay.
That's why games NEED higher fps than movies. Case closed.
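To make the "a frame is a delta of many frames" idea concrete, here's a toy sketch (plain Python/NumPy, not any camera's or engine's actual code): a film camera's shutter effectively averages everything that happens during the exposure, so one 24fps frame carries motion inside it, while a game wanting the same result would have to render several sub-frames first, which is exactly the delay mentioned above. The `render_subframe` scene is made up purely for illustration.

```python
import numpy as np

def film_style_frame(render_subframe, t_start, shutter_time, subframes=8):
    """Approximate a film exposure: average several renders taken across the
    shutter interval, so motion during the frame shows up as blur.
    `render_subframe(t)` stands in for 'render the scene at time t' and
    returns an HxWx3 float image."""
    times = t_start + np.linspace(0.0, shutter_time, subframes)
    return np.mean([render_subframe(t) for t in times], axis=0)

# Toy 'scene': a bright one-pixel-wide bar sweeping across a dark image.
def render_subframe(t, width=64, height=8, speed=240.0):
    img = np.zeros((height, width, 3))
    img[:, int(speed * t) % width, :] = 1.0
    return img

# One 24fps frame with a 180-degree shutter (exposure = half the frame time):
frame = film_style_frame(render_subframe, t_start=0.0,
                         shutter_time=0.5 / 24.0, subframes=8)
print(frame[0, :, 0].round(2))  # the bar is smeared across several columns
```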
Leave this thread, watch this GameSpot video, then come back.
In short, one of the biggest differences is motion blur: for movies it sort of smooths things out for our brains, whereas a video game going for 24 fps without faking the motion blur looks like, well, that Watch_Dogs webm. And there's Ocarina of Time on the N64 at, I THINK, 20 fps.
Now to be fair, that appears to be a framerate generated without the normal optical blurring that characterizes what people mean when they're talking about a filmic look.
Just quoting some people that get it.
The way that films are shot, edited, and touched up is completely different from the way games are rendered. Even motion blur was traditionally achieved in movies through the exposure of film. The blur appeared as a natural product of object or scene motion captured into one frame. Blur in games fundamentally doesn't work that way. It's simulated, just like everything else, and at this current generation the blur doesn't look anywhere near as natural as it does in film.
If you want a game to "look cinematic", I expect some serious effort on the technical side of actually making a game look cinematic. I expect pristine, clean image quality, blur and depth of field that look indistinguishable from their film counterparts, and a complete lack of loading screens so the story misses no beats.
Oh, but that's not what we're getting with these "cinematic" games, is it? We still get shitty blur, jaggies out the ass, and poorly segmented sequences or excessive loading times that, if they ever appeared in any movie, would have people walking out of the theater.
"My game is cinematic because it runs at 30 FPS" is the biggest load of horseshit I've ever heard to justify the framerate. That's not how a film works. I can't take a fucking iPhone camera, chop everything to look 24 FPS, then claim a video of my dog skateboarding "makes it look cinematic". Any developer making such a boneheaded argument is blowing smoke up your ass.
EDIT: I want to make it clear I think 30 FPS is still perfectly playable, but it's objectively inferior to 60 FPS in any game. 30 FPS has never been, and never will be, a conscious artistic decision by any developer. It is always, always a technical one that results in the developer trading off framerate for better visuals.
His post insinuated CoD = 60fps and great sales, therefore 60fps = better sales.
I flipped it with Halo = 30fps and also great sales, therefore fps doesn't matter for sales.
I can argue that, but wouldn't you agree the majority of us grew up with 24fps as the standard? So in the eyes of a good chunk of people, including developers, 30fps would have a more cinematic feel. It's a simple concept to understand; I don't see why it's so laughable at all.
60fps looks strange in movies because there are so few that use it. No other reason.
Oh sure suuuure.
Nothing to do with 24fps looking like real life. No noooo, nothing at all.
60 is preferable to 30, but only if those numbers are solid and stable. I'd trade "usually 60fps" for a rock-solid locked 30fps in a second.
Somewhat on that topic, why are 60 and 30 the only benchmarks? If a game can't achieve 60fps, why is 30 the default option for a next step down? Why don't we see any half-step compromises like a locked 40 or 50 or what have you?
I don't think that's how that works.
Which Halo are you talking about that runs at 30?
Maybe I'm wrong, I haven't played Halo in a bit, but the newer or new-ish ones run at 60.
"It's about the feel of 60 fps." "It's about the feel of 24 fps." "It is more like real life." I don't understand any of this, and it is very difficult not to think of these claims as people just trolling other people. What exactly is this "feel" of 24 frames per second? Is the feel of 24 fps similar to 30 fps?
I did not want to post in this thread, but I think it is ridiculous that people are attempting to defend 24 fps as good because it supposedly has a better feel than 30 or 60 fps and is more similar to real life.
Which is the bigger franchise?
Also everyone was talking about how CoD needs 60 fps to retain its gameplay feel and input.
Do you want to show evidence of anyone besides you talking about sales?
Yeah, I didn't insinuate that.
What I said is that 60 FPS matters to some developers to deliver the experience that they want to deliver, part of a package that includes exceptional responsiveness during multiplayer matches, in part delivered by the framerate. And it's noticeable even by the layperson who doesn't know the technical details.
I never said that a high framerate directly leads to better sales in a vacuum.
Oh, I definitely agree that this explains the phenomenon; I'm just not sure I'd consider it a particularly good justification. "It's what I'm used to" isn't likely to convince many people who disagree of the validity of your position.
If you're just saying "I grew up with this and it's what I'm used to so I prefer it," then sure, that's absolutely fine.
TV (which I think runs at 50Hz).
Here is Dark Souls II rendered, recorded, and played at 24fps.
It is a considerably different looking experience than yours.
I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.
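For what it's worth, part of why a 60fps capture exported at 24fps looks different from a game genuinely running at 24fps is plain arithmetic: 60 doesn't divide evenly by 24, so a simple drop-frames export has to pick source frames at uneven intervals. Quick toy check (this assumes a naive drop-frames conversion, which is just my guess at how that webm was made):

```python
# 60 source frames per second down to 24 output frames per second:
# 60 / 24 = 2.5, so a drop-frames export alternates 2- and 3-frame gaps,
# adding uneven stutter on top of the low framerate itself.
SOURCE_FPS, TARGET_FPS = 60, 24

picked = [int(i * SOURCE_FPS / TARGET_FPS) for i in range(TARGET_FPS + 1)]
gaps = [b - a for a, b in zip(picked, picked[1:])]

print(picked[:10])  # which 60fps frames survive into the 24fps video
print(gaps[:10])    # [2, 3, 2, 3, ...] -> motion steps of uneven size
```

A game actually targeting 24fps steps its simulation by a constant 1/24 s instead, so the motion is still choppy but at least evenly spaced, which is probably a big part of why the two videos read so differently.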
The problem I have with these webms posted is that the game itself isn't designed to run at that low a framerate, so it will look worse than one which was intended to. Motion blur and the like will be finely tuned around that target.
Needs motion blur.
Think you should redo this. Doesn't look like you're using motion blur either; I think it would help at a lower frame rate. I locked mine at 24 and it didn't seem that choppy. I even ran around, drove around, shot the place up. It was decent enough, though not really playable for a whole session.
The video even hitches around the 11-12 second mark.
Yes. But in your example there is no motion blur.
This. People always forget that 24fps only looks good because of motion blur. Take that out and movies would look like shit.
I think it has to do with how often TVs refresh per second. A normal one refreshes 60 times per second (60Hz), so 60 fps games show exactly 60 different frames per second while 30 fps games show just 30 of them.
60 is preferable to 30, but only if those numbers are solid and stable. I'd trade "usually 60fps" for a rock-solid locked 30fps in a second.
Somewhat on that topic, why are 60 and 30 the only benchmarks? If a game can't achieve 60fps, why is 30 the default option for a next step down? Why don't we see any half-step compromises like a locked 40 or 50 or what have you?
Guess what, video games aren't going to employ those film techniques either, so that's a moot point.
Now to be fair, that appears to be a framerate generated without the normal optical blurring that characterizes what people mean when they're talking about a filmic look.
Televisions and monitors typically refresh at 60Hz. When running at 45 fps, some frames will be displayed longer than others, causing judder, or a frame will change while it's being displayed, causing tearing. G-Sync is meant to dynamically adjust the refresh rate to match the framerate, eliminating the tearing and judder at those in-between framerates.
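To put rough numbers on the judder point, here's a toy sketch of the timing (plain Python, assuming a fixed 60Hz display with vsync; real drivers and displays are more complicated than this): each finished game frame has to wait for the next refresh, so at 45fps frames end up on screen for an uneven mix of one and two refreshes.

```python
# Work in ticks of 1/180 s so both rates are exact integers:
# one 60Hz refresh = 3 ticks, one 45fps game frame = 4 ticks.
REFRESH_TICKS = 3
FRAME_TICKS = 4

flip_gaps = []           # time between consecutive frame flips, in refreshes
ready, last_flip = 0, 0
for _ in range(9):
    ready += FRAME_TICKS
    # with vsync, the new frame appears on the first refresh at or after it's ready
    flip = -(-ready // REFRESH_TICKS) * REFRESH_TICKS   # ceiling to the next refresh
    flip_gaps.append((flip - last_flip) // REFRESH_TICKS)
    last_flip = flip

print(flip_gaps)  # [2, 1, 1, 2, 1, 1, ...] -> frames held for ~33ms, ~17ms, ~17ms
```

At a locked 30fps every frame gets exactly two refreshes, and at 60fps exactly one, which is why those two numbers are the natural targets on a 60Hz display and why a locked 40 or 50 doesn't pace evenly without something like G-Sync.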
This is bullshit. Please do not post this misinformed stuff, it furthers incorrect biases on gaf.
Yeah "bullshit"
I'd have zero problems if a game was 24fps, as long as there was an artistic reason behind it. Replicating the "filmic" experience would count in my opinion.
To me it's no different than making a game black and white, with pixelart graphics, 2.4:1 aspect ratio, etc. The artistic vision shouldn't be restrained just because it decreases the image quality.
If however a game just ran bad and hovered at an unlocked 24fps, then that'd rightfully deserve all the hate it'd get.
After you use SVP for all of your videos like I do, 24 fps will feel like laggy shit. Plus there was an article somewhere where people said that 72 fps induced the most emotion in the audience.
Like, real-life movement feels natural. It just happens. You don't notice a framerate.
24fps movies are like that. When people move it feels like how my eyes see real people.
60fps is more like "look at me I'm sooo smooth!" Feels artificial
I'm looking and can't find a 60 vs 24fps movie/tv show clip. Are there any out there?
To me a movie/tv show looks pretty terrible at 60fps (I see the soap opera/fast forward/cheap cam effect) whereas a game is better at 60fps (though I'm ok with a 30fps game and can't visually see the difference).
Dark Souls has per-object motion blur. It appears that Watch_Dogs really does not.
There are google searches for "how is babby formed?"
Your point is?
How many fps does real life run at?
60 fps only looks strange in movies because it is uncommon. That's literally it.
That 30fps feels smoother on consoles than pc.
It'll happen.
I just hope nobody ever argues for 60 FPS films. That shit looks horrid.
30fps is smoother on consoles than on PC. As long as it's locked, 30fps is perfectly fine.
Obviously, we can calculate our brain's fps through frame times. You simply divide 1 second (or 1000ms) by the time it takes for our brain to render a frame, i.e. our brain's "frame latency" (about 100ms). 1000/100 = 10, so we see at 10fps.
I don't see what all you frame rate junkies are on about; 30fps, hell, even 24fps is more than enough.
Yes, I am joking
Dark Souls has per-object motion blur. It appears that Watch_Dogs really does not.
Motion blur is a standard visual effect in videogames. Done correctly, it drastically improves the apparent fluidity of motion.
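For anyone wondering what "simulated" blur actually means in games, the common approach is roughly this (a toy sketch in plain Python/NumPy rather than a shader, and not any particular game's implementation): the renderer writes a per-pixel screen-space velocity, and a post-process averages a few colour samples taken along that vector.

```python
import numpy as np

def velocity_blur(frame, velocity, samples=8):
    """Per-pixel motion blur: average `samples` colour taps along each pixel's
    screen-space velocity (in pixels per frame), centred on the pixel.
    frame: HxWx3 image, velocity: HxWx2 array of (dx, dy)."""
    h, w, channels = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    out = np.zeros((h, w, channels), dtype=np.float64)
    for i in range(samples):
        t = i / (samples - 1) - 0.5                        # offsets spanning the velocity vector
        sx = np.clip(np.round(xs + velocity[..., 0] * t).astype(int), 0, w - 1)
        sy = np.clip(np.round(ys + velocity[..., 1] * t).astype(int), 0, h - 1)
        out += frame[sy, sx]
    return out / samples
```

A per-object blur like the Dark Souls one mentioned above would essentially be this with the velocity buffer only filled in for moving objects (and usually not for camera motion), which is why it reads differently from full-screen camera blur.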
TVS. AND. MONITORS. THE DIFFERENCE IS IN THE DISPLAY, NOT THE SYSTEM.
Yeah "bullshit"
Here is Dark Souls II rendered, recorded, and played at 24fps.
It is a considerably different looking experience than yours.
I don't know if it's something to do with how you recorded or played it (I'm guessing maybe played and recorded at 60fps then rendered the video out at 24fps) but I don't think yours is an accurate representation of playing at 24fps at all.
That weirdly doesn't actually look that bad, but half of that may be higher settings than consoles (or squishing the image down) and the fact I'm watching, not playing. It probably wouldn't feel that great actually playing at that and it's a huge, huge mistake for anyone to actually target 24 over 30, especially without motion blur.
I think it is partly an artistic choice. Much like huge parts of their audience, they don't know any better.
The soap opera look is due to lighting, for fuck's sake.