Someone on Reddit made a 30fps vs 60fps site.

Okay, the Counter-Strike example was a lot easier for me to tell apart. Meh, it depends on the genre, but it's still not a big deal to me. I still play Mario 64
 
What am I reading here? Some of you guys crack me up.

Yeah, it is insane how some people lack a grasp of simple logic (the 60 FPS video you are watching is part of "real life", and thus you cannot claim that it does not look natural because it has a higher framerate than "real life"!).
 
I cry when people can't see the difference. It's massive and 30fps is just so bad in comparison. I just finished Dark Souls 2 on PC and I can't imagine having to grind through that at half the frame rate. I wouldn't have bothered. Some things are alright at 30, but it's not desirable in the least.
If they can't see the difference, they wouldn't care, which means you shouldn't :P

In all seriousness, I can see the difference on that page but it is so slight I doubt I'd even give a shit if I was playing one or the other. Guess some people are more fussy about it than others.
 
Yeah, it is insane how some people lack a grasp of simple logic (the 60 FPS video you are watching is part of "real life", and thus you cannot claim that it does not look natural because it has a higher framerate than "real life"!).

It reminds me of those sunglasses advertised as "hd sunglasses"...they make your vision higher definition than reality!
 
None of the examples in this thread communicate the difference quite like playing a game that runs at a stable 60fps for minutes and hours, not a few seconds of a gameplay clip. When you're in control of the camera and the movement and the aim and the throttle, the differences become clearer than any side-by-side of gifs and webm videos can show. Watching is never like playing, because your brain and hands aren't being used to control it, to key off of those frames of visual data at high speeds, in very small slices of time. The more frames you get in the same slice of time, the more clearly you can recognize shapes and movement, and the faster and more appropriately you can react. 30fps is only okay because we don't get to choose with console games that have no PC version, but any game that requires decent hand-eye coordination and has your eyes staring at it for lengths of time always benefits from 60fps, especially on fixed-pixel displays that lack the motion resolution CRTs had. Nothing is more annoying than moving from 60fps straight to 30fps, but it's worth the experience just to know what's going on from the player's perspective.
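To put rough numbers on that "more frames per slice of time" point, here's a quick back-of-the-envelope sketch (plain arithmetic; the 200 ms reaction window is just an assumption of mine, not anything measured):

```python
# Back-of-the-envelope frame-time arithmetic for 30 vs 60 fps.
# The 200 ms "reaction window" below is an illustrative assumption.
for fps in (30, 60):
    frame_time_ms = 1000 / fps          # time between new frames
    # Worst case, an event happens just after a frame is shown, so it
    # waits a full frame interval before it even appears on screen.
    worst_case_delay_ms = frame_time_ms
    frames_in_window = int(200 / frame_time_ms)
    print(f"{fps} fps: {frame_time_ms:.1f} ms per frame, "
          f"up to {worst_case_delay_ms:.1f} ms before an event is shown, "
          f"{frames_in_window} frames of data per 200 ms window")
```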
 
I can easily tell 30 versus 60 while playing, but these videos do a really poor job of comparing. The bitrate is so low that the added clarity of 60fps is completely lost.
 
I could barely tell the difference. I think it's hilarious that someone actually made this site, though. FPS has never mattered to me unless it was completely awful like in Saints Row 4. Luckily the game was fun as fuck, so I trucked through it.
 
Sorry but the examples on that website hardly show any difference.

Some of the elitist 60-fps camp on GAF are the biggest DBs around. Why do they care if other people can't tell the difference?
 
I can't see a difference and that's one reason I enjoy games much more than most people on this forum.

What does this even mean? So because I can tell a difference, I don't enjoy video games as much as you?

Wow. So full of yourself.

I really don't mind if a game is 30 FPS or 60 FPS... Do I prefer 60? Hell yes I do, but that doesn't mean I won't enjoy a game just because it's 30 FPS...

Off the top of my head I can't even think of any 60 FPS games last gen that I enjoyed; really, the only ones that were 60 FPS were Call of Duty games and racing simulators, which are decent games but not my cup of tea. I prefer story-driven single-player games =)
 
Sorry but the examples on that website hardly show any difference.

Some of the elitist 60-fps camp on GAF are the biggest DBs around. Why do they care if other people can't tell the difference?

Idk 60fps just seems obviously smoother to me...

The effect is more noticeable in FPS games, where the differences between frames are generally much larger (check the BF4 clips and look at the corners of the frame, where you'll find the biggest deltas). The question to ask, imo, is: if you alternated between 60 and 30fps every 2 seconds (that is, if the game stuttered between 30fps and 60fps), do you think there would be no noticeable difference?
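For anyone who'd rather feel that stutter test than imagine it, here's a minimal sketch using pygame (window size, ball speed, and the 2-second toggle interval are all arbitrary choices of mine):

```python
# Minimal sketch of the "alternate 30 and 60 fps every 2 seconds" test.
# Assumes pygame is installed (pip install pygame); window size, ball
# speed, and the toggle interval are all illustrative.
import pygame

pygame.init()
screen = pygame.display.set_mode((800, 200))
clock = pygame.time.Clock()

x, speed = 0.0, 300          # ball position (px) and speed (px/s)
fps, elapsed = 60, 0.0       # current frame cap and time since last toggle

running = True
while running:
    dt = clock.tick(fps) / 1000.0   # cap the loop at the current fps
    elapsed += dt
    if elapsed >= 2.0:              # toggle the cap every 2 seconds
        fps = 30 if fps == 60 else 60
        elapsed = 0.0

    x = (x + speed * dt) % 800      # move the ball, wrap at the edge
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    screen.fill((0, 0, 0))
    pygame.draw.circle(screen, (255, 255, 255), (int(x), 100), 20)
    pygame.display.flip()

pygame.quit()
```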
 
Idk 60fps just seems obviously smoother to me...

The effect is more noticeable in FPS games, where the differences between frames are generally much larger (check the BF4 clips and look at the corners of the frame, where you'll find the biggest deltas). The question to ask, imo, is: if you alternated between 60 and 30fps every 2 seconds (that is, if the game stuttered between 30fps and 60fps), do you think there would be no noticeable difference?

How would I know if I haven't experienced that for myself? I have no idea.

I just know the comparison images on that website hardly show any difference. And I'm certainly not the only one to say that in this thread.
 
I feel like the ones who don't notice a difference are people who don't game on a PC very often (or at all).

I once only played games on my PS3, and I remember being shown videos of that new Devil May Cry game, one on the PS3 running at 30 and the other on a PC running at 60.

I honestly could not tell the difference. I just wasn't used to playing a 60 FPS game so they both looked identical to me.

Then I actually got a decent graphics card on my PC so I started playing some PC games. After that the difference between 60 and 30 FPS was just so obvious. I would change the settings in the game so that I could achieve 60, then increase the graphical quality and get only 30 FPS and that is when I started to truly see the difference between frame rates.
 
But your eyes are still seeing it, even if it is a screen, which means it still falls under the category of real life...
I mean when you spin the camera in the game, vs. when you spin your head around. It never looks so perfectly smooth when you spin your head around in real life. I honestly think this is why high framerates bother some people in some games, and why it much more easily causes nausea if you watch someone play a quick-moving 60FPS first-person shooter on a big TV, as opposed to playing something slower and at 30FPS like the Halo games.

Mind you, I'm not saying that 60FPS is bad or whatever, but in a way it can appear unnaturally smooth. It is regardless essential in some games. Fast-scrolling games like Resogun or Super Stardust look almost comically bad at 30FPS (you can see Resogun at 30 under Remote Play on Vita, and SSDHD co-op mode was 30FPS pre-patch). Those two games are examples where I think every human being with a working pair of eyes would notice a massive difference between 30 and 60, because at 30 both games looked basically broken.
 
I can tell the difference, but the 30fps vids aren't that bad. I would much rather have a higher res and all the graphical features turned on at 30fps than to make sacrifices for 60fps.
 
I think it's also worth pointing out that the 30FPS videos in this comparison are probably produced by halving the framerate of the original 60FPS captured footage. That's not a good way to make these videos: any object-based motion blur that a native 30FPS game would render doesn't appear to the same degree in 60FPS footage, and halving the framerate wouldn't add anywhere near as much of it back as native 30FPS footage would have.
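For what it's worth, you can reproduce both versions with stock ffmpeg filters if you want to see the difference yourself. A rough sketch (the filenames are placeholders, and frame averaging via tmix is only a crude stand-in for real in-engine motion blur):

```python
# Sketch of two ways to derive 30fps video from a 60fps capture.
# Requires ffmpeg on PATH; input/output filenames are placeholders.
import subprocess

SRC = "capture_60fps.mp4"   # hypothetical 60fps source clip

# 1) Naive decimation: drop every other frame. This is what comparison
#    sites appear to do, and it carries the 60fps clip's (lighter)
#    motion blur into the "30fps" version unchanged.
subprocess.run(["ffmpeg", "-i", SRC, "-vf", "fps=30",
                "30fps_decimated.mp4"], check=True)

# 2) Average each pair of source frames before decimating, which at
#    least accumulates blur across the two frames, crudely approximating
#    the heavier blur a native 30fps render would show.
subprocess.run(["ffmpeg", "-i", SRC, "-vf", "tmix=frames=2,fps=30",
                "30fps_blurred.mp4"], check=True)
```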
 
I'm not a scientist, or a learned person at all; this could all be total, misinformed bullshit (probably is). I'm pretty sure framerate is something your brain learns to see. When I was younger, playing PAL PSX and N64 games, the idea of one game being 'smoother' than another never occurred to me. I had no idea how the box under the TV made pictures happen on the screen, no concept of what a frame was or that I might enjoy more of them per second. Later I encountered some PS2 games with 50/60Hz toggles in the options menu, and had no real idea what it meant. I remember knowing that games like DMC felt snappier than something like GTA 3, but I couldn't have elucidated any further than that.

I got a gaming PC a few years ago, and it was only through a bunch of experimentation with different games and settings, discovering FRAPS and trying to find some kind of correlation between the yellow numbers and my eyeballs' perception, that I finally clued in to framerates. Today, I'm like a cyborg with this shit. Whenever these 30/60 gifs show up I can spot them instantly, with no effort at all. It's crystal clear, like asking me to look out a window and determine whether it's day or night. I'm looking at a 144Hz monitor right now, and since upgrading to it my brain's learnt to tell 120FPS from 60 just as easily. I can't quite spot 144 from 120 in a vacuum, but in those direct comparison sites with the ball moving back and forth across the screen I can do it.

So, yeah, I'm pretty sure it's just a skill your eyes and brain learn. As another example, before I learnt to play the guitar I was completely unable to identify fake guitar playing from real. I distinctly remember watching Crossroads when I was young and being convinced that Ralph Macchio could shred even better than Steve Vai. When I watch that movie today, though, I can clearly see that he's just flailing his fingers in the right spots, because my brain knows what to look for.
 
I'm not a scientist, or a learned person at all; this could all be total, misinformed bullshit (probably is). I'm pretty sure framerate is something your brain learns to see. When I was younger, playing PAL PSX and N64 games, the idea of one game being 'smoother' than another never occurred to me. I had no idea how the box under the TV made pictures happen on the screen, no concept of what a frame was or that I might enjoy more of them per second. Later I encountered some PS2 games with 50/60Hz toggles in the options menu, and had no real idea what it meant. I remember knowing that games like DMC felt snappier than something like GTA 3, but I couldn't have elucidated any further than that.

I got a gaming PC a few years ago, and it was only through a bunch of experimentation with different games and settings, discovering FRAPS and trying to find some kind of correlation between the yellow numbers and my eyeballs' perception, that I finally clued in to framerates. Today, I'm like a cyborg with this shit. Whenever these 30/60 gifs show up I can spot them instantly, with no effort at all. It's crystal clear, like asking me to look out a window and determine whether it's day or night. I'm looking at a 144Hz monitor right now, and since upgrading to it my brain's learnt to tell 120FPS from 60 just as easily. I can't quite spot 144 from 120 in a vacuum, but in those direct comparison sites with the ball moving back and forth across the screen I can do it.

So, yeah, I'm pretty sure it's just a skill your eyes and brain learn. As another example, before I learnt to play the guitar I was completely unable to identify fake guitar playing from real. I distinctly remember watching Crossroads when I was young and being convinced that Ralph Macchio could shred even better than Steve Vai. When I watch that movie today, though, I can clearly see that he's just flailing his fingers in the right spots, because my brain knows what to look for.

Personally, I could always see the difference even as a kid. I thought games that "moved faster" looked and played better. The only thing that's changed is that now I know what causes it.
 
Personally, I could always see the difference even as a kid. I thought games that "moved faster" looked and played better. The only thing that's changed is that now I know what causes it.

Same here. It was really noticeable when I went to the arcade and saw games like Cruis'n USA and Time Crisis running silky smooth compared to their console versions. Like you, I didn't know the technical reasons at the time, but I loved the way the arcade versions looked in motion.
 
There is definitely a difference between the two, but it's nowhere near as significant as people are making it out to be. It's a lot smoother, but it's not "night and day" or "a whole different world". Just the GAF hyperbole machine at work.
 
Same here. It was really noticeable when I went to the arcade and saw games like Cruis'n USA and Time Crisis running silky smooth compared to their console versions. Like you, I didn't know the technical reasons at the time, but I loved the way the arcade versions looked in motion.

Going from ISS Pro Evolution to Virtua Striker 2 was quite the shock.
 
Using Chrome and I can't tell the difference... wouldn't be surprised if the site host admitted that there's no difference, or that he switched them around. About to try another browser.
 
I can tell the difference, but I think all this will do is convince gamers that the difference is just that small.
 
OK, on:
Dirt 3 - there is almost no difference between them. I'm guessing this is a terrible comparison.
BF4 - I can see the big difference.
Red Orchestra 2 - see BF4.
Sleeping Dogs - see BF4.

I'm using Chrome on Windows 8.1
 
I can tell the difference, but I think all this will do is convince gamers that the difference is just that small.

It's a pretty terrible comparison though. There's too much video compression. It's like using badly compressed JPEGs to illustrate the difference between 720p and 1080p.
 
I can see the difference, but double the framerate doesn't make the game twice as good (or bad)... I think the most important thing is whether the controls are designed to work fine with the target framerate (on console), but of course games like PC shooters really benefit from a higher framerate =)
I also have to admit my strange fetish for badly performing games on consoles, like Slowerina of Time, unpatched Mass Effect on 360, Zone of the Enders 2, Slowdown of the Colossus and many more; those games are good ones and they make me smile when they perform really badly!! (Even if the huge slowdowns in the Z.O.E. 2 final battle almost made me throw the game in the trash bin at the higher difficulty levels)
 
What someone really needs to make is a site to test yourself on whether or not you can tell the difference, because I think it'd be really interesting to know how many people really can or can't.
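Something like that wouldn't be hard to knock together. A rough pygame sketch of a blind trial (trial count, clip length, and the moving square are all arbitrary choices of mine):

```python
# Rough sketch of a blind 30-vs-60 self-test, as suggested above.
# Assumes pygame is installed; every parameter here is illustrative.
import random
import pygame

pygame.init()
screen = pygame.display.set_mode((800, 200))
clock = pygame.time.Clock()
font = pygame.font.SysFont(None, 36)

TRIALS, correct = 10, 0
for trial in range(TRIALS):
    fps = random.choice((30, 60))   # hidden condition for this trial
    x = 0.0
    # Show 3 seconds of a square sweeping across the screen at the cap.
    for _ in range(fps * 3):
        dt = clock.tick(fps) / 1000.0
        x = (x + 400 * dt) % 800
        pygame.event.pump()
        screen.fill((0, 0, 0))
        pygame.draw.rect(screen, (255, 255, 255), (int(x), 80, 40, 40))
        pygame.display.flip()

    # Ask for a guess: press 3 for 30fps, 6 for 60fps.
    screen.fill((0, 0, 0))
    screen.blit(font.render("30 or 60? press 3 or 6", True,
                            (255, 255, 255)), (20, 80))
    pygame.display.flip()
    guess = None
    while guess is None:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                raise SystemExit
            if event.type == pygame.KEYDOWN and event.key == pygame.K_3:
                guess = 30
            elif event.type == pygame.KEYDOWN and event.key == pygame.K_6:
                guess = 60
    correct += (guess == fps)

print(f"{correct}/{TRIALS} correct")
pygame.quit()
```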
 
Personally, I could always see the difference even as a kid. I thought games that "moved faster" looked and played better. The only thing that's changed is that now I know what causes it.

I remember buying a super duper CRT that supported 120hz and higher at normal desktop resolutions (Sammy Syncmaster something or other IIRC).

Just flinging the arrow around on Win98 (or early XP, I don't recall now) was a revelation! Moving to an LCD was such a bummer after that monitor.
 
I'm not a scientist, or a learned person at all; this could all be total, misinformed bullshit (probably is). I'm pretty sure framerate is something your brain learns to see. When I was younger, playing PAL PSX and N64 games, the idea of one game being 'smoother' than another never occurred to me. I had no idea how the box under the TV made pictures happen on the screen, no concept of what a frame was or that I might enjoy more of them per second. Later I encountered some PS2 games with 50/60Hz toggles in the options menu, and had no real idea what it meant. I remember knowing that games like DMC felt snappier than something like GTA 3, but I couldn't have elucidated any further than that.

I got a gaming PC a few years ago, and it was only through a bunch of experimentation with different games and settings, discovering FRAPS and trying to find some kind of correlation between the yellow numbers and my eyeballs' perception, that I finally clued in to framerates. Today, I'm like a cyborg with this shit. Whenever these 30/60 gifs show up I can spot them instantly, with no effort at all. It's crystal clear, like asking me to look out a window and determine whether it's day or night. I'm looking at a 144Hz monitor right now, and since upgrading to it my brain's learnt to tell 120FPS from 60 just as easily. I can't quite spot 144 from 120 in a vacuum, but in those direct comparison sites with the ball moving back and forth across the screen I can do it.

So, yeah, I'm pretty sure it's just a skill your eyes and brain learn. As another example, before I learnt to play the guitar I was completely unable to identify fake guitar playing from real. I distinctly remember watching Crossroads when I was young and being convinced that Ralph Macchio could shred even better than Steve Vai. When I watch that movie today, though, I can clearly see that he's just flailing his fingers in the right spots, because my brain knows what to look for.

I don't believe at all that it's a learned skill. I first became aware of framerates as a teenager in the '90s because low framerates in first-person games were a trigger for my migraines; I think that goes to show we can perceive framerates even when we don't consciously know the technical terms.
 
I feel like the ones who don't notice a difference are people who don't game on a PC very often (or at all).

I once only played games on my PS3, and I remember being shown videos of that new Devil May Cry game, one on the PS3 running at 30 and the other on a PC running at 60.

I honestly could not tell the difference. I just wasn't used to playing a 60 FPS game so they both looked identical to me.

Then I actually got a decent graphics card on my PC so I started playing some PC games. After that the difference between 60 and 30 FPS was just so obvious. I would change the settings in the game so that I could achieve 60, then increase the graphical quality and get only 30 FPS and that is when I started to truly see the difference between frame rates.

I don't own a console, and my PC has an Nvidia Titan. I have been playing PC games exclusively since I moved on from the Nintendo 64. I can easily tell the difference between 2560 x 1440 and 1920 x 1080, but I can't see any difference between the 30 fps and 60 fps samples posted here. I believe the reason for that is related to individual genetic differences and how they affect the visual cortex, rather than differences in equipment.
 