Someone on Reddit made a 30fps vs 60fps site.

That is absurd. The point of the comparison is "a video game running at 30fps vs. the same video game running at 60fps"; there is nothing console-specific about it. It is a comparison, not an example. Also, I'd say that barring a few VERY specific cases (text-based games, games with no real-time action at all, and extremely slow-paced games), 60fps is universally the better tradeoff to make. In my opinion, the performance of a game takes priority over literally any other technical aspect, including AA, resolution, and effects; it is absurd to pretend that 60fps doesn't have a big impact on most games.

I agree it has a huge impact. On consoles it makes games look shittier for a very marginally noticeable gain in smoothness.

For PCs I agree, you don't need to make the tradeoff. For consoles I strongly disagree. Perhaps you haven't seen any thread involving a new console game announcement these days; it invariably gets derailed by the same people whining about 60fps. At the end of the day, on fixed hardware the tradeoff is definitely not worth it in most cases. If you want to game on consoles and have some psychosomatic disorder that makes you ultra-sensitive to framerates, you're definitely going to be in the minority. #dealwithit
 
It is the difference between a slideshow and a film. I don't get how people don't see the difference.
Didn't check the site btw, except this.

I agree it has a huge impact. On consoles it makes games look shittier for a very marginally noticeable gain in smoothness.

For PCs I agree, you don't need to make the tradeoff. For consoles I strongly disagree. Perhaps you haven't seen any thread involving a new console game announcement these days; it invariably gets derailed by the same people whining about 60fps. At the end of the day, on fixed hardware the tradeoff is definitely not worth it in most cases. If you want to game on consoles and have some psychosomatic disorder that makes you ultra-sensitive to framerates, you're definitely going to be in the minority. #dealwithit

There are games on consoles, like those character action games, that must run at a solid, high framerate because it affects gameplay.
Competitive online games must have a solid framerate too.
It has nothing to do with whether you play on PC or consoles. Also, consoles have had 60fps games since forever; it's only last gen that killed the framerate in most games to satiate the 'graphix fanz'.
 
I can see the difference but I honestly just can't stand 60fps. It always looks contrived and I end up with headaches afterwards.

Your (very real) GAF cred has just plummeted, comparable only to the market crash of May 2012.
 
It's kinda like "Do you want the original Japanese voice-overs in your anime/movie, or do you prefer the English dub?"

Well, the answer to that often comes down to what you're used to. If I watched a good anime in Japanese first, I would hate suddenly being subjected to the English dub. And it would feel just as weird to make the jump the other way.

I don't think this example is very good. When anime gets localized, they also change the meaning of words and phrases to match the expected American audience. I think certain shows also get localized better than others.
 
I can see a difference but I have to really pay attention. Personally if I have to focus that much to see a difference, it doesn't really bother me.

I'm all for 60fps in games, but I personally don't understand people who find 30fps to be unplayable. I'm just glad I'm not one of those people; it would ruin a huge section of gaming. That said, just because some people are more sensitive to framerate differences than others doesn't mean they are superior gaming specimens, or that people who can't notice a difference are what's wrong with the gaming industry.
 
60fps does feel nice. But man it does give me sitcom vibes at the same time.

Viewing it next to 30 fps, yes it does.

But the thing is, if you aren't comparing 60fps to 30 side by side but only play a game at 60fps by itself, it feels better (imo). It makes the game crisper and clearer, because there's less stutter when you're in motion, which makes every detail in the game world easier to see, which makes the game more immersive and impressive. And the game feels smoother and better to play (depending on the game).

I don't think this example is very good. When anime gets localized, they also change the meaning of words and phrases to match the expected American audience. I think certain shows also get localized better than others.

Hehe, fair enough. Though that's a bit more "meta" than I was aiming for with my example. I guess it makes more sense if you assume a perfect localization (if such a thing existed).
 
The people who can't tell the difference here must either be trolling or have some retinal disorder; the difference is huge.
How one can't see a difference is beyond me.
I wish people would stop calling others "trolls" or "blind" when they don't know what they're talking about.
Frame rate detection capability (critical flicker fusion) varies a lot, especially with screen brightness, field of view, and periphery.
On a 25 cd/m² screen you may not even be able to see the difference between 15 and 30 Hz, aside from strobing artifacts.
[graph: critical flicker fusion vs. luminance]

[graph: CFF vs. luminance for different stimulus sizes and eccentricities]

(There are no disorders I'm aware of that make a major difference. Epilepsy and migraine patients have slightly lower flicker fusion rates due to their longer cortical silent periods, but nothing major.)
 
The site has bad examples.

Check this example out.

http://a.pomf.se/fiblow.webm

Obviously I can see the difference; I have a 7970 and play most of my games at 60. I'm also perfectly happy to play well below 60 and ramp up the graphics settings.

None of the clips shown will make anyone understand why some GAFers are passing out / projectile vomiting / having to lie in a dark room for a week due to headaches brought on by 30fps. No one is watching them and going:

"oh, so that's why every thread gets turned into a shitty resolution vs. framerate war"

Edit: Playing *some* genres at 60 would of course be a totally different story.

You are really, really bad at math. If 30fps is just fine, the difference between 30 and 60 cannot possibly be minuscule. 100% != 15%. The difference is not even close to minuscule.
minuscule /ˈmɪnəskjuːl/
adjective: extremely small; tiny.

What the fuck has it got to do with math?
 
Viewing it next to 30 fps, yes it does.

But the thing is, if you aren't comparing 60 fps to 30 side by side, but only play a game with 60 fps by itself. It feels better (imo). It makes the game crisper and clearer becase there's less stutter when you are in motion in the game, which makes every detail in the game world easier to see, which makes the game more immersive and impressive. And the game feels smoother and better to play (depending on the game).

It does it does.

I wish people would stop calling others "trolls" or "blind" when they don't know what they're talking about.
Frame rate detection capability (critical flicker fusion) varies a lot, especially with screen brightness, field of view, and periphery.
On a 25 cd/m² screen you may not even be able to see the difference between 15 and 30 Hz, aside from strobing artifacts.
[graph: critical flicker fusion vs. luminance]

[graph: CFF vs. luminance for different stimulus sizes and eccentricities]

(There are no disorders I'm aware of that make a major difference. Epilepsy and migraine patients have slightly lower flicker fusion rates due to their longer cortical silent periods, but nothing major.)

... I don't understand this. Can you explain it a bit more?
 
Gah. Most browser/PC combinations can't handle 60FPS so this website is worthless and only serves to further confuse the uneducated on this topic.
 
Gameplay 60fps.
In-game cutscenes 30fps.
These should be the benchmark.

I don't want to have a cinematic look when I'm playing the game; that's what the cutscenes are for. I want the gameplay to be as responsive as can be.
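To make that proposed benchmark concrete, here's a minimal sketch of a game loop that caps cutscenes at 30fps and gameplay at 60fps. The `in_cutscene`, `update`, and `render` callables are hypothetical placeholders, not any real engine's API.

```python
import time

def game_loop(in_cutscene, update, render):
    """Illustrative only: 30fps cap in cutscenes, 60fps cap in gameplay."""
    while True:
        # Pick the per-frame time budget for the current mode.
        budget = 1 / 30 if in_cutscene() else 1 / 60
        start = time.perf_counter()
        update()   # advance game state
        render()   # draw the frame
        # Sleep off whatever is left of the budget to hold the cap.
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
```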
 
Do what it says. First, look at the left image for a few loops.

Then switch to the right image.

If you're constantly going back and forth the difference is mostly lost. If you compare one after the other, the differences are stark.

So, I did that, and there is a big difference. BUT... is it wrong to like the look of the 30fps more? At least in all the driving stuff. It looks... idk, better.
 
It is the difference between a slideshow and a film. I don't get how people don't see the difference.
Didn't check the site btw, except this.



There are games on consoles, like those character action games, that must run at a solid, high framerate because it affects gameplay.
Competitive online games must have a solid framerate too.
It has nothing to do with whether you play on PC or consoles. Also, consoles have had 60fps games since forever; it's only last gen that killed the framerate in most games to satiate the 'graphix fanz'.

I agree. My original post did say that online FPSs, fighters, and racing sims all do better at 60fps. But whining about, for example, an open-world game like Watch Dogs running at 30fps is extremely stupid. Look at MGS Ground Zeroes: the game ran at 60fps for some strange, inexplicable reason and ended up being nearly universally panned for its visuals.
 
Left looks cinematic; right plays just great.

I can't make fine adjustments to the control of, say, a snowboard or a rally car at 30fps; it feels like I'm doing it in slow motion (which I actually am!)...
 
I can easily tell 30 and 60 apart while playing, but this site is giving me a hard time with those examples (or is my browser fucking them up? ... only the DiRT one loads)
 
Gameplay 60fps.
In-game cutscenes 30fps.
These should be the benchmark.

I don't want to have a cinematic look when I'm playing the game; that's what the cutscenes are for. I want the gameplay to be as responsive as can be.

Pretty much. Cool site though, I'll visit again when there's more videos.
 
I wish people would stop calling others "trolls" or "blind" when they don't know what they're talking about.
Frame rate detection capability (critical flicker fusion) varies a lot, especially with screen brightness, field of view, and periphery.
On a 25 cd/m² screen you may not even be able to see the difference between 15 and 30 Hz, aside from strobing artifacts.
[graph: critical flicker fusion vs. luminance]

[graph: CFF vs. luminance for different stimulus sizes and eccentricities]

(There are no disorders I'm aware of that make a major difference. Epilepsy and migraine patients have slightly lower flicker fusion rates due to their longer cortical silent periods, but nothing major.)

[image: "yeah, science!" reaction meme]


No idea what it means tho.
 
I didn't simply state that it's subjective. Read the whole post. It contains every element you would ever need on this topic. :D

Very little objectivity is needed in this discussion.

I agree it has a huge impact. On consoles it makes games look shittier for a very marginally noticeable gain in smoothness.

For PCs I agree, you don't need to make the tradeoff. For consoles I strongly disagree. Perhaps you haven't seen any thread involving a new console game announcement these days; it invariably gets derailed by the same people whining about 60fps. At the end of the day, on fixed hardware the tradeoff is definitely not worth it in most cases. If you want to game on consoles and have some psychosomatic disorder that makes you ultra-sensitive to framerates, you're definitely going to be in the minority. #dealwithit

Holy shit those are some of the most ignorant things I have read all day. Not even worth responding to.

minuscule /ˈmɪnəskjuːl/
adjective: extremely small; tiny.

What the fuck has it got to do with math?

Double cannot possibly equal extremely tiny.
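For what it's worth, the arithmetic being argued over is simple; a quick sketch:

```python
# 30 -> 60 fps doubles the frame rate (a 100% increase) and halves
# the time each frame stays on screen.
for fps in (30, 60):
    print(f"{fps} fps = {1000 / fps:.1f} ms per frame")
# 30 fps = 33.3 ms per frame
# 60 fps = 16.7 ms per frame
print((60 - 30) / 30 * 100)  # 100.0 (% increase), hardly "minuscule"
```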
 
I honestly preferred the 30fps versions, when placed side by side like that. In such a direct comparison, 30fps just makes the action look like it was captured on film rather than video.

The 60fps versions remind me why I never turn on the 100Hz tech in my TV for movies and why I disliked watching The Hobbit in 48fps HFR: at 48fps the Hobbit movies looked like they were being 'videoed on a set' rather than 'filmed on location'.

60fps is probably best for things that need to look 'live', e.g. sports games.
 
I honestly preferred the 30fps versions, when placed side by side like that. In such a direct comparison, 30fps just makes the action look like it was captured on film rather than video.

The 60fps versions remind me why I never turn on the 100Hz tech in my TV for movies and why I disliked watching The Hobbit in 48fps HFR: at 48fps the Hobbit movies looked like they were being 'videoed on a set' rather than 'filmed on location'.

60fps is probably best for things that need to look 'live', e.g. sports games.

Exactly how I feel.
 
Very little objectivity is needed in this discussion.



Holy shit those are some of the most ignorant things I have read all day. Not even worth responding to.

The objectivity is needed because of the kind of post that you just labeled as "ignorant". :D
Those are the people I'm trying to get through to, hehe. Because I agree with you. They are.
 
I honestly preferred the 30fps versions, when placed side by side like that. In such a direct comparison, 30fps just makes the action look like it was captured on film rather than video.

The 60fps versions remind me why I never turn on the 100Hz tech in my TV for movies and why I disliked watching The Hobbit in 48fps HFR: at 48fps the Hobbit movies looked like they were being 'videoed on a set' rather than 'filmed on location'.

60fps is probably best for things that need to look 'live', e.g. sports games.

That seems utterly absurd to me, but whatever you prefer I suppose. However, it isn't just about visuals - 60fps gives you more responsive gameplay. It's the best thing for any game that benefits from more responsive gameplay.
 
I have to agree that the difference on this site isn't so clear, though the WebMs show that difference spectacularly well. I think it's likely because of the intense compression on these videos.

Hopefully they update with better videos or WebMs :)
 
No idea what it means tho.
It does it does.
... I don't understand this. Can you explain it a bit more?
I can try.

TL;DR:
On the x axis, you have brightness of the thing you're viewing.
On the y axis, you have the minimum frame rate that appears "perfectly smooth" (CFF)
The different lines show the dependence on other factors besides brightness.

More detail, for those interested:
Graph 1: FoV fixed at 19°, viewed straight on; shows colour dependence in low-light situations.
Graph 2: curves for 0.05°, 0.5°, and 5.7° FoV, viewed straight on and at 35 degrees to the side (peripheral vision).

Math (CFF model):

f(E, L, d, p) = (0.24E + 10.5)(log L + log p + 1.39 log d - 0.0426E + 1.09) Hz

f = CFF in Hz
E = eccentricity in degrees
L = retinal illuminance in Trolands
d = stimulus diameter in degrees
p = pupil area in mm²

Oculus VR, worst-case example:
Dark-adapted pupil at 1 cd/m²: 13 mm², log p = 1.1
215 cd/m² screen = 3.45 log Td
110-degree VR HMD, full white, ultra-low-persistence flicker

Centered eye: CFF = 89 Hz
35 degrees right/left: CFF = 127 Hz

That's why Oculus is targeting 90+ Hz for their consumer model.
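For anyone who wants to check those numbers, here's a small sketch that plugs the post's worst-case Oculus parameters into the CFF model above; the function name and structure are mine, not from any published code.

```python
import math

def cff(E, log_L, d, log_p):
    """CFF (Hz) from the model quoted above.
    E: eccentricity (deg), log_L: log10 retinal illuminance (Trolands),
    d: stimulus diameter (deg), log_p: log10 pupil area (mm^2)."""
    return (0.24 * E + 10.5) * (log_L + log_p + 1.39 * math.log10(d)
                                - 0.0426 * E + 1.09)

# Worst-case numbers from the post: 215 cd/m^2 screen (3.45 log Td),
# dark-adapted 13 mm^2 pupil (log p ~ 1.1), 110-degree full-white field.
print(cff(E=0,  log_L=3.45, d=110, log_p=1.1))  # ~89 Hz, centered gaze
print(cff(E=35, log_L=3.45, d=110, log_p=1.1))  # ~132 Hz; the post quotes 127,
# so the peripheral figure presumably assumes slightly different parameters.
```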
 
I have to agree with another poster here: 60fps looks more "cartoonish" (I don't know how to describe it; it just looks too smooth?) than 30fps. 30fps has a cinematic feel to it. But 60fps is definitely smoother, and I would take that in shooters especially.

I'd wager this mindset is similar to the robotics argument of the "uncanny valley". You know the game should move closer to real life (with 60 being better than 30), but the visuals you are looking at are nowhere near that lifelike quality, so it appears "odd". I think this will be the case for a long time.
I am curious which one a person who had never experienced movies and was just introduced to games would naturally prefer. Obviously a difficult thing to test, but I have a feeling our bias toward cinema means we are "comfortable" with 30 and thus "prefer" it.
I see this much like loving our mom's cooking as "the best". Your mother probably over-salted, under-seasoned, or simply lacked some skill... but it was hers and she made it for you. Technically superior soups could never stack up to the comfort food she made for you when you were sick.
30fps is your mother's "sick soup". If it makes you feel good, who cares.
 