The site is missing three things:
1. To allow for a fair comparison, the 30fps gif should have much better graphics.
2. A note that the visual effect of fluidity only translates to better gameplay when a couple of other criteria are met.
3. A note saying "Lean back, devs know better than you!"
It's the kind of difference that's hard to notice, but once you do, you can't go back below 60.
OK, maybe someone can explain this to me. I have watched lots of movies over the years, and every single one of them was projected at 24 fps. None of them seemed juddery or unpleasant because of that (a lot were unpleasant for entirely unrelated reasons like plot, acting, etc., but that's another story). Why then is 30 fps in games suddenly painful to people's eyes? I understand that 17 ms/frame vs 33 ms/frame has advantages in FPS games, but I am only talking about visual appearance here. Am I missing something?
I play 30fps console games and 100fps+ PC games and I don't have any problems switching; it's all the performance snobs who make it out to be a massive deal.
If you wanna spend £400-500 on a GPU and the same again on a 144Hz monitor, then knock yourself out; just don't start threads telling people they must be blind or have genetic defects if they don't give a flying fuck about it.
Same. In fact, if the game 'works' at 30FPS (something like Resogun clearly doesn't, something like TLOU does) I forget about framerate probably after five minutes of play time.
In cinema the DoP knows not to whip the camera around in a circle really fast. Quick angular movements are what show the limitations of low framerates. If you watch any film moment where the camera is turning really fast, you can see judder - I find it really unpleasant.
This is why a lot of the comparisons at the site kind of suck - the camera just isn't moving all that much. When it's just going forwards and backwards, with really gentle looks sideways, you don't have as many egregious judder points. It needs to pan a lot for you to feel the difference.
In the Sleeping Dogs vid there's a moment right at the end of the loop where the camera pans 90 degrees sideways (with a weird correction, eh) - it's really clear there. If he was doing that even faster - like, say, an FPS player would - the higher framerate would be even more beneficial.
Just to make the point clearer with some math, think of it like this: there's an apple in the middle of the screen, otherwise white. The camera whips 90 degrees right in 1/10th of a second. The 30fps video has 3 frames to show the movement of the apple. If the camera has a 90-degree FoV, you only see the apple in the 1st frame, and maybe half of the apple in the second frame (plus blur, depending on shutter speed...), then nothing in the 3rd frame. The 60fps shot has 6 frames for the same movement. The relative positions of the apple on successive frames will be MUCH closer together, so the movement will be perceived as smoother. The apple is on more frames.
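If you want to play with the numbers, here's a rough Python sketch of that apple example (my own, just to illustrate; the function name and the linear-whip assumption are mine):

def apple_offsets(fps, pan_deg=90.0, pan_time=0.1, fov_deg=90.0):
    """Angular offset of the apple from screen centre on each captured frame."""
    n_steps = round(pan_time * fps)            # whole frames spanning the whip
    frames = []
    for i in range(n_steps + 1):
        camera_angle = pan_deg * i / n_steps   # assume a perfectly linear whip
        offset = -camera_angle                 # apple starts dead centre
        on_screen = abs(offset) <= fov_deg / 2
        frames.append((round(offset, 1), on_screen))
    return frames

for fps in (30, 60):
    print(fps, "fps:", apple_offsets(fps))
# At 30 fps the apple jumps 30 degrees per frame and is gone by the 3rd step;
# at 60 fps it jumps 15 degrees, so successive positions are twice as close.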
Hope that helps.
What someone really needs to make is a site to test yourself on whether or not you can tell the difference, because I think it'd be really interesting to know how many people really can or can't.
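A bare-bones version of that test is easy enough to knock together. Here's a rough sketch (mine, assuming you have Python and pygame installed, and a display whose refresh rate doesn't give the game away): it shows a moving square at a randomly chosen 30 or 60 fps and lets you guess which one you're looking at.

# Minimal guess-the-framerate test: a square slides across the screen at a
# randomly chosen 30 or 60 fps; press 3 or 6 to guess, Esc to quit.
import random
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 360))
pygame.display.set_caption("30 or 60? Press 3 or 6 to guess")
clock = pygame.time.Clock()

fps = random.choice([30, 60])
score = rounds = 0
x, dx = 0.0, 4.0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN:
            if event.key == pygame.K_ESCAPE:
                running = False
            elif event.key in (pygame.K_3, pygame.K_6):
                guess = 30 if event.key == pygame.K_3 else 60
                rounds += 1
                score += guess == fps
                print("right" if guess == fps else "wrong", f"({score}/{rounds})")
                fps = random.choice([30, 60])  # new round, new framerate

    # scale the per-frame step so on-screen speed is identical at either fps
    x = (x + dx * (60.0 / fps)) % 640
    screen.fill((255, 255, 255))
    pygame.draw.rect(screen, (200, 0, 0), (int(x), 160, 40, 40))
    pygame.display.flip()
    clock.tick(fps)  # cap the loop at the chosen framerate

pygame.quit()

Whether it's a fair test also depends on your display, which a post further down gets at.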
It's not that people can't tell, people just don't care. I can tell that 30fps is not as fluid, but it doesn't matter to me, and that's the reason why developers aim for 30fps rather than 60. The average consumer doesn't give a toss and graphics take precedence; it's easier to hit 30 with better graphics than it is to hit 60. People would also question why a next-gen console is necessary if games looked sped up like something from Benny Hill and didn't have a large enough leap in graphics fidelity.
Can you put my post review on Yelp, I don't want people thinking I make good posts or anything.
I forgot that the latest feature of gaf is reviewing how people post, can you make sure to note my spelling and use of punctuation?
I really want this to happen. I honestly can't comprehend how anyone couldn't tell the difference between both when it is so abysmal to me (and plenty).
Has anyone pointed out that your display could make a difference in your perception between 30 & 60 fps?
I play 30fps console games and 100fps+ PC games and I don't have any problems switching; it's all the performance snobs who make it out to be a massive deal.
You should have bought a cheaper GPU, then.
Difference is quite obvious.
I really can't believe some people are being honest when they say they don't see a difference.
With that in mind, I don't see anything wrong with 30 fps as long as it's locked and not a 20-30 variation as in some last-gen games on consoles.
Personally, if we're talking about consoles with limited hardware, I prefer them going for 30 fps but trying to push other elements like graphics, effects, resolution, player/enemy count, physics, AI, etc. instead of focusing on 60fps.
The difference is night and day to me. Like black and white to color.
What a dumb criticism. Is it only "resolution snobs" who prefer 1080p to 720p?
The difference is night and day to me. Like black and white to color.
:facepalm:
1080p vs 720p is something completely different from 30fps vs 100fps+.
And he was clearly talking about fps, not resolution.
When I look at these gifs I can barely tell, but I don't play gifs. And when I play on my PC I can definitely tell the difference between 30fps and 60fps at 1080p. In some games it matters, in some it doesn't (turn-based games, mostly).
Also, the above goes for a 22-inch monitor or a 42-inch HDTV, both at 1080p with the same refresh rate.
Alo's post reminded me that I love this example, so Imma post it again.
I'm getting the impression that video game boards are slowly becoming like audio boards, where you can read hundreds of pages of rants about, for example, the lack of dynamic range in modern audio CD releases.
A lot of hyperbole, and the main target group doesn't care about it.
You absolutely can see the difference.
It is not jarringly different, but it is definitely perceptible and very easy to spot.
Plus, this is just in videos, which don't even come close to properly conveying the difference.
When playing a game the difference is VERY evident. The smoothness of the movement and gameplay is immediately noticeable.
I understand that 17 ms/frame vs 33 ms/frame has advantages in FPS games, but I am only talking about visual appearance here. Am I missing something?
The reason: temporal anti-aliasing ("motion blur"). A film camera's shutter stays open for a large part of each frame, so every 24 fps frame carries natural motion blur that bridges the gap to the next one; a game without motion blur renders a series of razor-sharp instants instead.
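To see what that shutter integration does, here's a little numpy sketch (mine, not from the thread, with an exaggerated fully-open shutter): it "films" a bright dot moving across a one-pixel-high screen by averaging several subframes per output frame, the way a real exposure would, and compares that to sampling one sharp instant.

# Motion blur as temporal anti-aliasing: a film-style frame averages the
# subframes inside its exposure window; a game-style frame is one instant.
import numpy as np

WIDTH = 32
OUT_FPS = 24
SUBSTEPS = 10          # subframes averaged into each output frame
SPEED = 120.0          # pixels per second

def instant(t):
    """Game-style image at time t: one razor-sharp pixel, no blur."""
    img = np.zeros(WIDTH)
    img[int(SPEED * t) % WIDTH] = 1.0
    return img

for n in range(3):     # first three output frames
    t0 = n / OUT_FPS
    sharp = instant(t0)
    blurred = np.mean([instant(t0 + k / (OUT_FPS * SUBSTEPS))
                       for k in range(SUBSTEPS)], axis=0)
    print("sharp  :", "".join("#" if v > 0 else "." for v in sharp))
    print("blurred:", "".join("#" if v > 0 else "." for v in blurred))

The blurred rows show the dot smeared across the pixels it passed through during the exposure, which is exactly the streak that makes 24 fps film read as continuous motion.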
Ok, so I got an example for those who may have a problem with the "half the frames removed" method of comparison.
The following clips have two different master files. They are natively captured at their respective framerates. You can see each one starts and ends slightly differently.
http://a.pomf.se/zrvgrn.webm
http://a.pomf.se/ychkrc.webm
Can't really get more obvious than this.
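For contrast, the usual "half the frames removed" method that those clips avoid boils down to this (a trivial sketch of mine, assuming the decoded 60fps frames are in a list):

# Derive the 30 fps clip from the 60 fps master by keeping every other frame,
# so both clips show the exact same take. Natively captured masters, like the
# clips above, can't match frame for frame, which is why they start and end
# slightly differently.
def to_30fps(frames_60fps):
    return frames_60fps[::2]  # drop every second frame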
I can tell the difference, but I really don't like the "look" of 60 fps in some situations. It's like those HFR movies. It just looks odd to me. Maybe I actually do prefer the "cinematic" look of lower fps?
Those are truly two different gameplay recordings from different sessions? How did they turn out so identical in a fighting game like that?
They won't load.