The Albatross
Member
Website is completely broken on tablet.
> I have a crappy laptop and hadn't read the whole thread.

Win 7
i7 930
gtx 460
6gb ram
Using Chrome
I have a hard time believing that my setup cannot render the webm appropriately.
If that is the case, I need to upgrade.
What am I reading here? Some of you guys crack me up.
> If they can't see the difference, they wouldn't care, which means you shouldn't

I cry when people can't see the difference. It's massive and 30fps is just so bad in comparison. I just finished Dark Souls 2 on PC and I can't imagine having to grind through that at half the frame rate. I wouldn't have bothered. Some things are alright at 30, but it's not desirable in the least.
> Win 7
> i7 930
> gtx 460
> 6gb ram
> Using Chrome
> I have a hard time believing that my setup cannot render the webm appropriately.
> If that is the case, I need to upgrade.

They aren't using webm.
Yeah, it is insane how some people lack a grasp of simple logic (the 60 FPS video you are watching is part of "real life" and thus you cannot claim that it does not look natural because it has a higher framerate than "real life"!).
You guys should try a different browser. Firefox ran them at sub 30/60 FPS for me, while Chrome was full speed.
> I can't see a difference and that's one reason I enjoy games much more than most people on this forum.

Probably not fighting games.
Sorry but the examples on that website hardly show any difference.
Some of the elitist 60-fps camp on GAF are the biggest DBs around. Why do they care if other people can't tell the difference?
Idk 60fps just seems obviously smoother to me...
The effect is more noticeable in FPS games, where the differences between frames are generally much larger (check the BF4 clips and look at the corners of the frame, where you'll find the biggest deltas). The question to ask, imo, is if you alternated between 60 and 30fps every 2 seconds (that is, if the game stuttered between 30fps and 60fps) do you think there would be no noticeable difference?
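The "bigger deltas" point can be put in rough numbers. A minimal sketch (my own illustration, not from the comparison site) assuming a constant camera spin speed:

```python
# Illustrative only: how far the image moves between two consecutive frames
# during a camera spin, at 30 vs 60 fps. The spin speed is an assumption.

def per_frame_delta(spin_deg_per_sec: float, fps: int) -> float:
    """Degrees the camera rotates between two consecutive frames."""
    return spin_deg_per_sec / fps

spin = 180.0  # a fast 180-degrees-per-second turn, typical of an FPS flick
d30 = per_frame_delta(spin, 30)  # 6.0 degrees between frames
d60 = per_frame_delta(spin, 60)  # 3.0 degrees between frames
print(d30, d60)
```

At 30fps each frame jumps twice as far as at 60fps, which is why fast camera motion (and the corners of the frame, which sweep the fastest) makes the difference easiest to spot.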
> I mean when you spin the camera in the game, vs. when you spin your head around. It never looks so perfectly smooth when you spin your head around in real life. I honestly think this is why the high framerates bother some people in some games, and why it so much more easily causes nausea if you watch someone play a quick-moving 60FPS first person shooter on a big TV as opposed to playing something slower and at 30FPS like Halo games.

But your eyes are still seeing it, even if it is a screen, which means it still falls under the category of real life...
I'm not a scientist, or a learned person at all; this could all be total, misinformed bullshit (probably is). I'm pretty sure framerate is something your brain learns to see. When I was younger, playing PAL PSX and N64 games, the idea of one game being 'smoother' than another never occurred to me. I had no idea how the box under the TV made pictures happen on the screen, no concept of what a frame was or that I might enjoy more of them per second. Later I encountered some PS2 games with 50/60Hz toggles in the options menu, and had no real idea what it meant. I remember knowing that games like DMC felt snappier than something like GTA 3, but I couldn't have elucidated any further than that.
I got a gaming PC a few years ago, and it was only through a bunch of experimentation with different games and settings, discovering FRAPS and trying to find some kind of correlation between the yellow numbers and my eyeballs' perception, that I finally clued in to framerates. Today, I'm like a cyborg with this shit. Whenever these 30/60 gifs show up I can spot them instantly, with no effort at all. It's crystal clear, like asking me to look out a window and determine whether it's day or night. I'm looking at a 144Hz monitor right now, and since upgrading to it my brain's learnt to tell 120FPS from 60 just as easily. I can't quite spot 144 from 120 in a vacuum, but in those direct comparison sites with the ball moving back and forth across the screen I can do it.
So, yeah, I'm pretty sure it's just a skill your eyes and brain learn. As another example, before I learnt to play the guitar I was completely unable to identify fake guitar playing from real. I distinctly remember watching Crossroads when I was young and being convinced that Ralph Macchio could shred even better than Steve Vai. When I watch that movie today, though, I can clearly see that he's just flailing his fingers in the right spots, because my brain knows what to look for.
Personally, I could always see the difference even as a kid. I thought games that "moved faster" looked and played better. The only thing that's changed is that now I know what causes it.
Same here. It was really noticeable when I went to the arcade and saw games like Cruisin USA and Time Crisis running silky smooth compared to their console versions. Like you, I didn't know the technical reasons at the time, but I loved the way the arcade versions looked in motion.
> It's a pretty terrible comparison though. There's too much video compression. It's like using badly compressed JPEGs to illustrate the difference between 720p and 1080p.

I can tell the difference, but I think all this will do is convince gamers that the difference really is that small.
Does the operating system matter as well? Asking since I'm on OS X.
I still play Mario 64
> I'm not a scientist, or a learned person at all; this could all be total, misinformed bullshit (probably is). I'm pretty sure framerate is something your brain learns to see. When I was younger, playing PAL PSX and N64 games, the idea of one game being 'smoother' than another never occurred to me. I had no idea how the box under the TV made pictures happen on the screen, no concept of what a frame was or that I might enjoy more of them per second. Later I encountered some PS2 games with 50/60Hz toggles in the options menu, and had no real idea what it meant. I remember knowing that games like DMC felt snappier than something like GTA 3, but I couldn't have elucidated any further than that.

I don't believe at all it's a learned skill. I first became aware of framerates as a teenager in the '90s because low framerates in first-person games were a trigger for my migraines; I think that goes to show that we can perceive framerates even when we don't consciously know the technical terms.
I feel like the ones who don't notice a difference, are people who don't game on a PC very often (or at all).
I used to play games only on my PS3, and I remember being shown two videos of that new Devil May Cry game: one on the PS3 running at 30 and the other on a PC running at 60.
I honestly could not tell the difference. I just wasn't used to playing a 60 FPS game so they both looked identical to me.
Then I actually got a decent graphics card on my PC so I started playing some PC games. After that the difference between 60 and 30 FPS was just so obvious. I would change the settings in the game so that I could achieve 60, then increase the graphical quality and get only 30 FPS and that is when I started to truly see the difference between frame rates.
> Which runs at 60fps most of the time...

Mario 64 was capped at 30FPS and I'm pretty sure it had frame drops too.