It is a subjective debate. Stating that it is subjective changes literally nothing.
I didn't simply state that it's subjective. Read the whole post. It contains every element you would ever need on this topic.

It is a subjective debate. Stating that it is subjective changes literally nothing.
That is absurd. The point of the comparison is "a video game running at 30fps vs the same game running at 60fps"; there is nothing console-specific about it. It's a comparison, not an example. Also, barring a few VERY specific cases (text-based games, games with no real-time action at all, and extremely slow-paced games), I'd say 60fps is universally the better tradeoff to make. In my opinion a game's performance takes priority over literally any other technical aspect, including AA, resolution, and effects. It's absurd to pretend that 60fps doesn't have a big impact on most games.
I agree it has a huge impact. On consoles it makes games look shittier for a very marginally noticeable gain in smoothness.
For PCs I agree, you don't need to make the tradeoff. For consoles I strongly disagree. Perhaps you haven't seen any thread involving a new console game announcement these days. It invariably gets derailed by the same people whining about 60fps. At the end of the day the tradeoff on a fixed spec is definitely not worth it in most cases. If you want to game on consoles and have some psychosomatic disorder that makes you ultra sensitive to framerates, you're definitely going to be in the minority, #dealwithit.
I see a huge difference in the fluidity of the image.
I can see the difference but I honestly just can't stand 60fps. It always looks contrived and I end up with headaches afterwards.
It's kinda like "Do you want original japanese voice-overs in your anime/movie or do you prefer the english dub?"
Well, the answer to that often comes down to what you are used to. If I watched a good anime in Japanese first, I would hate suddenly being subjected to the English dub version. And it would feel just as weird to make the jump the other way.
60fps does feel nice. But man it does give me sitcom vibes at the same time.
I don't think this example is very good. When anime gets localized, the meaning of words and phrases is also changed to match the expected American audience. I think certain shows also get localized better than others.
The people who can't tell the difference here must either be trolling or have some retinal disorder; the difference is huge.
I hope people would stop calling people "trolls" or "blind" when they don't know what they are talking about. How one can't see a difference is beyond me.
minuscule
You are really, really bad at math. If 30fps is just fine, the difference between 30 and 60 cannot possibly be minuscule. 100% != 15%. The difference is not even close to minuscule.
There's no way you can tell the difference just looking at it, but if you have a controller in your hands you can tell pretty easily.
Viewing it next to 30 fps, yes it does.
But the thing is, if you aren't comparing 60 fps to 30 side by side, but only play a game at 60 fps by itself, it feels better (imo). It makes the game crisper and clearer because there's less stutter when you are in motion, which makes every detail in the game world easier to see, which makes the game more immersive and impressive. And the game feels smoother and better to play (depending on the game).
I hope people would stop calling people "trolls" or "blind" when they don't know what they are talking about.
Frame rate detection capability (critical flicker fusion) varies a lot, especially with screen brightness, field of view and periphery
On a 25 cd/m^2 screen you may not even be able to see the difference between 15 and 30 Hz, besides strobing artifacts.
(There are no disorders I'm aware of that make a major difference. Epilepsy and migraine patients have a slightly lower flicker fusion rate due to their longer cortical silent periods, but nothing major.)
Do what it says. First, look at the left image for a few loops.
Then switch to the right image.
If you're constantly going back and forth the difference is mostly lost. If you compare one after the other, the differences are stark.
Gah. Most browser/PC combinations can't handle 60FPS so this website is worthless and only serves to further confuse the uneducated on this topic.
It is the difference between a slideshow and a film. I don't get how people don't see the difference.
Didn't check the site btw, except this.
There are games on consoles, like those character action games, that must run at a solid, high framerate because it affects gameplay.
Also competitive online games must have a solid framerate too.
It has nothing to do with whether you play on PC or consoles. Also, console games have had 60 fps since forever; it's only last gen that killed framerate in most games to satiate the 'graphix fanz'.
Gameplay 60fps.
In-game cutscenes 30fps.
These should be the benchmark.
I don't want to have a cinematic look when I'm playing the game; that's what the cutscenes are for. I want the gameplay to be as responsive as can be.
I hope people would stop calling people "trolls" or "blind" when they don't know what they are talking about.
Frame rate detection capability (critical flicker fusion) varies a lot, especially with screen brightness, field of view and periphery
On a 25cd/m^2 screen you may not even be able to see the difference between 15 and 30 Hz, besides strobing artifacts.
(There are no disorders I'm aware of that make a major difference. Epilepsy and migraine patients have a slightly lower flicker fusion rate due to their longer cortical silent periods, but nothing major.)
I don't see the difference.
I didn't simply state that it's subjective. Read the whole post. It contains every element you would ever need on this topic.
I agree it has a huge impact. On consoles it makes games look shittier for a very marginally noticeable gain in smoothness.
For PCs I agree, you don't need to make the tradeoff. For consoles I strongly disagree. Perhaps you haven't seen any thread involving a new console game announcement these days. It invariably gets derailed by the same people whining about 60fps. At the end of the day the tradeoff on a fixed spec is definitely not worth it in most cases. If you want to game on consoles and have some psychosomatic disorder that makes you ultra sensitive to framerates, you're definitely going to be in the minority, #dealwithit.
minuscule
/ˈmɪnəskjuːl/
adjective (also spelled: miniscule)
1. extremely small; tiny.
What the fuck has it got to do with math?
I see a huge difference in the fluidity of the image.
Not from those images.
I honestly preferred the 30fps versions, when placed side-by-side like that. In such a direct comparison 30fps just makes the action look like it was captured on film rather than video.
The 60fps versions remind me why I never turn on the 100Hz tech in my TV for movies and why I disliked watching The Hobbit in 48fps HFR: at 48fps the Hobbit movies looked like they were being 'videoed on a set' rather than 'filmed on location'.
60fps is probably best for things that need to look 'live', ex. sports games.
60fps is probably best for things that need to look 'live', ex. sports games.
Very little objectivity is needed in this discussion.
Holy shit those are some of the most ignorant things I have read all day. Not even worth responding to.
I honestly preferred the 30fps versions, when placed side-by-side like that. In such a direct comparison 30fps just makes the action look like it was captured on film rather than video.
The 60fps versions remind me why I never turn on the 100Hz tech in my TV for movies and why I disliked watching The Hobbit in 48fps HFR: at 48fps the Hobbit movies looked like they were being 'videoed on a set' rather than 'filmed on location'.
60fps is probably best for things that need to look 'live', ex. sports games.
No idea what it means tho.
I can try. It does, it does.
... I don't understand this. Can you explain it a bit more?
f(E, L, d, p) = (0.24E + 10.5)(log L + log p + 1.39 log d − 0.0426E + 1.09) Hz

f = CFF (critical flicker fusion frequency, Hz)
E = eccentricity in degrees
L = retinal illuminance in trolands
d = stimulus diameter in degrees
p = pupil area in mm²
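A quick sketch of that formula in code, assuming base-10 logarithms throughout (the post mixes "Log" and "log"); the sample inputs are illustrative values I picked, not numbers from the post:

```python
import math

def critical_flicker_fusion(E, L, d, p):
    """Estimate critical flicker fusion frequency (Hz).

    E: eccentricity in degrees
    L: retinal illuminance in trolands
    d: stimulus diameter in degrees
    p: pupil area in mm^2
    """
    return (0.24 * E + 10.5) * (
        math.log10(L) + math.log10(p)
        + 1.39 * math.log10(d)
        - 0.0426 * E + 1.09
    )

# Illustrative (assumed) values: a foveal stimulus (E = 0) at
# 100 trolands, 2 degrees wide, seen through a 10 mm^2 pupil.
print(round(critical_flicker_fusion(0, 100, 2, 10), 1))  # ≈ 47.3 Hz
```

Note how the log L term pushes CFF up with brightness; on a dim screen the estimate drops, which matches the earlier point about a 25 cd/m² display.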
I have to agree with another poster here, 60fps looks more "cartoonish" (I don't know how to describe it, it just looks too smooth?) than the 30fps. 30fps has a cinematic feel to it. But 60fps is definitely smoother and I would take that in shooters especially.