60FPS vs. 30FPS vs. 20FPS difference shown w/ the help of F1 2013 (pcgameshardware.de)

I was expecting a time dilation joke :|

Time Dilation is only available to the gaming industry in Iceland, fucking patents ;\

---
That's not entirely true. Using this formula,
[formula image]
you will find the eye actually peaks at the magical 24FPS, thus making movies the perfect display of the eye's capabilities. Anything above that is really just "Hey, look at what I can do".

The human eye simply can't register above 24fps.

Seems right, and I know this stuff, because I'm an expert.
 
This video got me thinking.......

Why does it have to be 30fps OR 60fps?

Why can developers not simply drop 10fps and stick with 50fps? 45fps even?


Why is it so binary?
 
I can see the difference when it's side by side like that, but I don't know if I could tell them apart on their own.

I'm fine with not being able to tell though, don't want to ruin a bunch of games for myself.
 
The video, to me, showed a pretty big difference. The fact that the next-gen consoles will not be able to hit 60 fps is a huge disappointment.
That's just silly. Of course they can "hit" 60 fps. It's going to vary on a game-by-game basis just as it always has. We will see more 60 fps titles going forward than we did this past gen, though; I'm sure of it.

This video got me thinking.......

Why does it have to be 30fps OR 60fps?

Why can developers not simply drop 10fps and stick with 50fps? 45fps even?
The framerate must divide evenly into the refresh rate, or else you end up with uneven output that results in judder. It looks awful.
 
Since we are talking about FPS, I was wondering if anyone in this thread could help me out with something. Bungie said that Halo 3 is 'locked 30FPS'. Now, if you've played the game, you'll know the frame rate drops when the Scarab(s) explode. Is Bungie's claim incorrect because there is a way to actually lock the entirety of the game to 30fps no matter the circumstances, or will there always be dips SOMEWHERE in a 'locked 30fps' game, even if they are minor/very rare?
 
I like the extra fluidness of 60fps, but my brain freaks out about it. It feels like the video is in fast forward. 30fps for me. 20 is too low. If I remember right, between 30 and 24 I can't tell a difference.
 
Since we are talking about FPS, I was wondering if anyone in this thread could help me out with something. Bungie said that Halo 3 is 'locked 30FPS'. Now, if you've played the game, you'll know the frame rate drops when the Scarab(s) explode. Is Bungie's claim incorrect because there is a way to actually lock the entirety of the game to 30fps no matter the circumstances, or will there always be dips SOMEWHERE in a 'locked 30fps' game, even if they are minor/very rare?

Locked to 30 just means it's set up so that the game won't go above 30 fps. It can and will still go below.
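To illustrate (a minimal C sketch of a generic frame cap, assuming a POSIX clock; update()/render() are hypothetical placeholders, and this is obviously not Bungie's actual code): the cap works by sleeping whenever a frame finishes early, so it can only stop the game from running faster than the target, never slower.

Code:
#include <stdint.h>
#include <time.h>

#define TARGET_FRAME_NS (1000000000LL / 30)   /* ~33.3 ms budget per frame */

static int64_t now_ns(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (int64_t)ts.tv_sec * 1000000000LL + ts.tv_nsec;
}

int main(void) {
    for (int frame = 0; frame < 300; frame++) {   /* ~10 seconds at 30 fps */
        int64_t start = now_ns();

        /* update(); render();  -- the real work; a Scarab explosion
           would blow well past the 33.3 ms budget here */

        int64_t spent = now_ns() - start;
        if (spent < TARGET_FRAME_NS) {
            /* frame finished early: sleep off the remainder (the "lock") */
            struct timespec rest = { 0, (long)(TARGET_FRAME_NS - spent) };
            nanosleep(&rest, NULL);
        }
        /* frame finished late: there is nothing left to sleep away,
           so the output simply dips below 30 for that stretch */
    }
    return 0;
}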
 
Meh, I've been disappointed with my 120Hz monitor (S23A700D). My plasma and CRT still have a noticeable advantage in motion resolution. I hear the 120Hz monitors with LightBoost are better; I want to get the VG248QE.

I just recently bought a BenQ XL2320TE and it had LightBoost available. It really cuts the brightness down by half, but you can reeeaaalllly tell the difference when running a Blur Busters test. Though when I tried it with CS:GO, I didn't really notice a difference between that and 144Hz, even with the 5ms added delay.
 
Can someone explain to me why 60 is the benchmark?

Elitist gamers with unrealistic expectations.

If all games were 1080p60 and looked not much better than current gen, you can bet your arse they are the same kind of people who would moan about not having "next gen" graphics.
 
Very interesting.

I'm one person who can tell when FPS drops in a game, but if a game is running at a steady 30FPS I really can't tell much of a difference between a 30FPS and a 60FPS game... or 20FPS stable and 30FPS stable. So that video was pretty enlightening. I wish they had let the clips play out longer rather than showing only the very beginning of the race... Sure, you can see the lines on the ground and how they look at different FPS rates, but I wanted to see how the cars moved, which affects gameplay more than the lines you're driving over.
 
Don't really notice it when playing games. Sure, I see it in this video (side by side), but generally it's not something that I notice.

I played Skyrim at 15-20 fps without noticing (until using Fraps and realizing it was so low), and that was fine for me.
 
Since we are talking about FPS, I was wondering if anyone in this thread could help me out with something. Bungie said that Halo 3 is 'locked 30FPS'. Now, if you've played the game, you'll know the frame rate drops when the Scarab(s) explode. Is Bungie's claim incorrect because there is a way to actually lock the entirety of the game to 30fps no matter the circumstances, or will there always be dips SOMEWHERE in a 'locked 30fps' game, even if they are minor/very rare?

A locked framerate only means the game will not go higher than the specified rate; it can still dip way below 30 and still be 'locked' (this is the case for nearly all console games locked at 30 FPS). If you really want a constant 30, you need a game that is locked at 30 FPS and whose minimum framerate is also 30.
 
Elitist gamers with unrealistic expectations.

If all games were 1080p60 and looked not much better than current gen, you can bet your arse they are the same kind of people who would moan about not having "next gen" graphics.

I wouldn't. I think the whole industry should reset itself back a graphical generation to allow for actual HD resolutions and smooth framerates. I'd be perfectly fine playing games as they look now if it meant achieving that result.

I played Skyrim at 15-20 fps without noticing (until using Fraps and realizing it was so low), and that was fine for me.

I... uhh... how?
 
That was my point: not everyone gives a crap about framerates as long as it's playable. I never said it was a smooth experience.
Had it been any worse, I'd argue the game was unplayable at times. That's why it's a bad example: the game itself is slow-paced rather than fast action to begin with, and it barely handles long periods of running with frequent heading changes as is.

Arguing that the game's slow tempo makes it a good choice for a low-FPS example is fairly weird.
 
This video got me thinking.......

Why does it have to be 30fps OR 60fps?

Why can developers not simply drop 10fps and stick with 50fps? 45fps even?


Why is it so binary?

Judder. Most displays refresh at a maximum of 60 Hz, so any framerate that doesn't divide evenly into 60 means showing frames an uneven number of times, which leads to a noticeable decrease in smoothness even if the framerate is constant.

Code:
60 FPS
1 2 3 4 5 6

30 FPS
1 1 2 2 3 3

20 FPS
1 1 1 2 2 2

45 FPS
1 2 3 3
4 5 6 6
7 8 9 9

It's basically a framerate drop that recurs at a fixed interval. You can mitigate it with various blending/processing techniques, but you don't want to deal with it if you don't have to.
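Just to make the cadence concrete (a small C sketch, not from the post; the 12-refresh window is arbitrary, and the doubled frame lands at a different spot than in the table above, but the uneven rhythm is the same): it prints which source frame a 60 Hz display shows on each refresh for a given constant framerate.

Code:
#include <stdio.h>

/* Which source frame a 60 Hz display shows on each of 12 refreshes. */
static void cadence(int fps) {
    printf("%2d FPS: ", fps);
    for (int refresh = 0; refresh < 12; refresh++)
        printf("%d ", refresh * fps / 60 + 1);   /* integer division */
    printf("\n");
}

int main(void) {
    cadence(60);   /* 1 2 3 4 5 6 ...      every frame shown once         */
    cadence(30);   /* 1 1 2 2 3 3 ...      every frame held two refreshes */
    cadence(20);   /* 1 1 1 2 2 2 ...      every frame held three         */
    cadence(45);   /* 1 1 2 3 4 4 5 6 ...  one frame in three doubled     */
    return 0;
}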
 
A lot of people don't seem to understand that most video games check for input when the frame updates. So whenever a frame is drawn and displayed onscreen, the game is checking to see if the player has pressed any buttons or manipulated any axes.

At 60 frames per second, you have 60 opportunities in a second to have your command "read" by the game. At 30, you have exactly half the number of opportunities. This is why 60 frames per second is crucial in some genres, preferable in others, and not really a big deal in a few. In a fighting game, 60 chances to block an incoming attack is going to mean a huge advantage over a comparable player playing with 30 chances.

Conversely, a game of Chessmaster can run at 5 frames per second for all its non-stop action.
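Here's a bare-bones version of the loop being described (hypothetical C; poll_button() is a stand-in for whatever input API the platform actually provides): the input read sits inside the frame loop, so the polling rate is literally the framerate.

Code:
#include <stdbool.h>
#include <stdio.h>

/* Stand-in for a real input API (SDL, XInput, etc.). */
static bool poll_button(void) { return false; }

int main(void) {
    const int fps = 60;                     /* try 30 to halve the reads */
    const double frame_ms = 1000.0 / fps;

    for (int frame = 0; frame < fps; frame++) {   /* one second of play */
        bool pressed = poll_button();       /* the once-per-frame read  */
        if (pressed) {
            /* react on this frame, e.g. start the block in a fighter */
        }
        /* update(); render();  -- the rest of the frame's work */
    }
    printf("input sampled %d times in one second (every %.1f ms)\n",
           fps, frame_ms);
    return 0;
}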
 
The big difference between 60 and 30 is how well the frames blend together.

But there are other factors that contribute to that besides raw FPS, such as frame latency and frame-time intervals (i.e. frame pacing).

I've noticed some games at 30 that appear to be running at 60 just because of how smooth the frames blend together.
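To put a rough number on that (a toy C example with made-up frame times, not measurements from any real game): both captures below average roughly 30 fps, but one delivers frames at even intervals while the other alternates fast and slow frames, and it's the even one that reads as "smooth".

Code:
#include <math.h>
#include <stdio.h>

#define N 6

int main(void) {
    /* Two hypothetical 30 fps captures: same average, different pacing. */
    double even[N]   = { 33.3, 33.3, 33.3, 33.3, 33.3, 33.3 };
    double uneven[N] = { 16.7, 50.0, 16.7, 50.0, 16.7, 50.0 };
    double *caps[2] = { even, uneven };
    const char *names[2] = { "even", "uneven" };

    for (int c = 0; c < 2; c++) {
        double sum = 0, sumsq = 0;
        for (int i = 0; i < N; i++) {
            sum   += caps[c][i];
            sumsq += caps[c][i] * caps[c][i];
        }
        double mean = sum / N;
        double sd   = sqrt(sumsq / N - mean * mean);   /* frame-time jitter */
        printf("%-6s pacing: avg %.1f ms (~%.0f fps), stddev %.1f ms\n",
               names[c], mean, 1000.0 / mean, sd);
    }
    return 0;
}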
 
Why would it be ouch? I'd imagine most people would prefer the extra eye candy. I know in the majority of circumstances I would.
That would be true in plenty of cases, I'm sure, but I have to wonder whether people would say the same thing if they got to try out 60fps in whatever 30fps game they were playing. I think it's often a case of not realizing how good 60fps actually is in a particular game because they've just not experienced it. It's hard to really 'imagine' what a game will be like at 60fps vs 30fps. It's something you have to get hands-on with to really feel and see the difference.

I'm sure lots of people would still choose eye candy, but I think it would sway things a bit if they could get a first-hand idea of what 60fps actually does for a game. Even watching videos doesn't do it full justice.
 
Elitist gamers with unrealistic expectations.

If all games were 1080p60 and looked not much better than current gen, you can bet your arse they are the same kind of people who would moan about not having "next gen" graphics.
I'm not an elitist gamer. The negative impact on gameplay aside, at 30fps the frames look disjointed to me, as if I were looking at the projection of a zoopraxiscope that's spinning too slowly. If other people don't suffer from this, I can understand why it doesn't bother them. It bothers me. If that means I have to avoid certain games, so be it.
 
Every game should have as high a framerate as possible. This isn't even up for debate.


But I think most of this thread is sarcasm so we cool.
 
First video I've seen where I can tell the difference between 30 and 60 perfectly. 20FPS even looks like it's skipping a few frames.

I can go back and forth between 30 and 60 easily, but 60FPS truly does look great and smooth.
 
In my Quake 3 days with a 120Hz CRT @ 640x480 I was able to easily notice framerate drops, even small ones (5-10 fps).

RefreshLock was a must with CRTs back then
whippersnappers
 
No, I have realistic standards. 1080p60 with "next gen" graphics out of a $400 box is too much to ask.

Elitist gamers with unrealistic expectations.

If all games were 1080p60 and looked not much better than current gen, you can bet your arse they are the same kind of people who would moan about not having "next gen" graphics.

No, the people who don't care about 60fps are the ones who will complain about the graphics. 60fps at 1080p already improves the presentation, but since shader quality isn't any better and there aren't extra particle effects, they'll say the graphics haven't improved.
 
Judder. Most displays refresh at a maximum of 60 Hz, so any framerate that doesn't divide evenly into 60 means showing frames an uneven number of times, which leads to a noticeable decrease in smoothness even if the framerate is constant.

Code:
60 FPS
1 2 3 4 5 6

30 FPS
1 1 2 2 3 3

20 FPS
1 1 1 2 2 2

45 FPS
1 2 3 3
4 5 6 6
7 8 9 9

It's basically a framerate drop that recurs at a fixed interval. You can mitigate it with various blending/processing techniques, but you don't want to deal with it if you don't have to.

Awesome explanation, thanks!

That said, doesn't that mean that if my PC's GPU can manage a constant 45-50fps in a particular game, I'd be better off locking it at 30?
 