
Why 30FPS was once acceptable, BIT ISN'T ANYMORE!... mostly...

Bo_Hazem

Banned
When you are a janitor you accept that you have to wear overalls, but when you have studied and landed another job as an engineer, you expect an office and a good chair. Things like that happen in life; you just have to deal with it. Also, you must not take on any debts that you cannot pay back. It'll only sink you deeper.

Anyway back to the topic, when you go black you never go back.
 

Cattlyst

Member
Thanks for the detailed post, OP. As an older gamer I grew up playing 8- and 16-bit consoles and naturally progressed to the 32-bit systems. Blah blah. Anyway, I’ve never really understood the whole 60fps obsession. Like the post says, older games were never at that speed, so I guess I just never cared. I remember playing Perfect Dark on N64 and that thing really struggled to maintain probably 15 fps at times, and I didn’t care. Does make me laugh when I see younger gamers complaining if a game isn’t 60fps 🤷
 

nkarafo

Member
30 fps was only acceptable in the PS1/SAT/N64 generation. That's because textured 3D graphics were new and being able to run them at 30fps at all was a good start. All generations prior to that had 60fps as the standard. The generation after that had more 60fps games, but later on everything went down the drain. 30fps is only "acceptable" now because the majority of average joes play slow-paced cinematic games that don't even need to be responsive at all.

The motion blur is a result of the crappy "sample and hold" LCD technology. CRTs don't have that, so they don't suffer from this; that's why motion on CRTs is always crystal clear. You need more than 240Hz to achieve similar motion clarity on a high-refresh-rate monitor, and the game NEEDS to run at 240+ fps too; a 60fps game on a 240Hz monitor will still be a blurry mess (unless you enable other features like black frame insertion, which can have its own artifacts).
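Rough back-of-the-envelope sketch of that persistence point (the panning speed and the ~1.5 ms CRT-style flash are just illustrative numbers I picked, not measurements):

```python
# Eye-tracking blur on a sample-and-hold display: while your eye tracks a
# moving object, each frame is held still for the full frame time, so the
# image smears across the retina by roughly speed * hold_time.

def smear_px(speed_px_per_s: float, hold_time_s: float) -> float:
    return speed_px_per_s * hold_time_s

speed = 1920  # a pan crossing a 1080p screen in one second (illustrative)

for label, hold in [("60 Hz sample-and-hold", 1 / 60),
                    ("240 Hz sample-and-hold", 1 / 240),
                    ("CRT-like ~1.5 ms impulse", 0.0015)]:
    print(f"{label:26s} -> ~{smear_px(speed, hold):5.1f} px of smear")
```

Which is why even 240Hz sample-and-hold still smears several times more than a short CRT-style impulse for the same content speed.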

In conclusion, standards of gaming are lower. People seem to care more about how good something looks as a static image instead of in motion. Sad.
 

rofif

Can’t Git Gud
Yeah, it’s input lag. Demon's Souls' 30fps mode has 75ms more input lag than Bloodborne. Seriously, install both and test it. BB feels great.
It’s the modes. 30fps needs to be properly implemented, not vsynced crap.
Uncharted 4 does it well: good input lag and heavy, high-quality motion blur. Forbidden West's 30fps also feels very good but could use better motion blur. OLED is great because it has no lag, but it requires great motion blur; otherwise it looks jerky. Uncharted 4 has that good motion blur.

So you are completely wrong about motion blur. It is essential and required for OLED. It adds smoothness in motion. Each frame should be a capture of the time that passed, not a still.
Don’t mix this up with FSR and visual artifacts.

Remember that 60 to 30 fps is only a ~16ms difference in frame time. No reason for games to have 100ms worse input lag like some do.

Again: motion blur is essential to avoid jerky motion and stutter on OLED. Unless you are running at 240Hz; then you've got enough frame data.
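Spelling out that 16ms figure (the 100ms is just the example gap from the post above, not a measurement):

```python
# The frame-time gap between 30 fps and 60 fps is only ~16.7 ms, so if a
# 30 fps mode measures ~100 ms worse than the 60 fps mode, most of that gap
# has to come from the rest of the pipeline (vsync buffering, frame queues),
# not from the frame rate itself.

frame_time_30 = 1000 / 30   # ~33.3 ms per frame
frame_time_60 = 1000 / 60   # ~16.7 ms per frame
gap = frame_time_30 - frame_time_60
print(f"frame-time gap: {gap:.1f} ms")

measured_gap_ms = 100       # the "100 ms worse" example above
print(f"latency not explained by frame rate alone: {measured_gap_ms - gap:.1f} ms")
```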
 

PeteBull

Member
30 vs 60fps, all im gonna say is, as a european, i remember well the late 90s and having access to a pirated 60fps NTSC Tekken 3 version, then buying the legit retail PAL copy that only ran at 50fps - it was like the game was playing underwater, thats how slow it felt (not just looked, felt)

And thats just 50 vs 60fps, not even 30 vs 60 like u are trying to imply is similar/same, its a humongous difference, man, in any game that requires fast reaction time - so not turn-based rpgs/strategies and such ;P

On top, remember graphics moved forward a bunch vs the 90s, so diminishing returns are hitting us hard; those games in 60fps modes look only a bit worse vs their fidelity/30fps modes, as confirmed by true lookers: Ratchet, Spider-Man/Miles Morales, Demon's Souls, Forza Horizon 5, Horizon Forbidden West, Ragnarok. For details watch the Digital Foundry vids about them, where they usually have to do some srs close-up to show the difference.
 

Go_Ly_Dow

Member
With soaring development costs, devs should save all the extra fancy features for PC and ensure that all console versions release with a 4K 30fps mode at minimum (doesn't have to be native, but with a solid upscaling/reconstruction solution) and a 1440p 60fps mode at minimum, with 40fps as the middle ground.
 

Topher

Identifies as young
OH NO, MODS! there's a typo in the title :messenger_loudly_crying: BUT, not BIT... fuck :messenger_tears_of_joy:

Considering your user name...

 

Cyborg

Member
I follow this rule

- if there is a little difference in graphics between 60 and 30 fps I play the game on 60.
- if there is a big difference in graphics I prefer 30 fps
- if there is a big difference between 60 and 30 fps (in graphics) but there is an option to use 40fps and get better graphics I choose that one.
 

rofif

Can’t Git Gud
30 vs 60fps, all im gonna say is, as a european, i remember well the late 90s and having access to a pirated 60fps NTSC Tekken 3 version, then buying the legit retail PAL copy that only ran at 50fps - it was like the game was playing underwater, thats how slow it felt (not just looked, felt)

And thats just 50 vs 60fps, not even 30 vs 60 like u are trying to imply is similar/same, its a humongous difference, man, in any game that requires fast reaction time - so not turn-based rpgs/strategies and such ;P

On top, remember graphics moved forward a bunch vs the 90s, so diminishing returns are hitting us hard; those games in 60fps modes look only a bit worse vs their fidelity/30fps modes, as confirmed by true lookers: Ratchet, Spider-Man/Miles Morales, Demon's Souls, Forza Horizon 5, Horizon Forbidden West, Ragnarok. For details watch the Digital Foundry vids about them, where they usually have to do some srs close-up to show the difference.

50 vs 60 was different because games slowed down. It wasn't done properly on PS1; time calculation was tied to fps.
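What "time calculation tied to fps" means in practice, as a toy sketch (made-up numbers, obviously not actual PS1 code): the frame-tied loop literally runs ~17% slower at PAL 50Hz, while the delta-time loop doesn't.

```python
# Toy comparison: per-frame movement vs delta-time movement.
# A game that advances the world a fixed amount each frame runs ~17% slower
# at PAL 50 Hz than at NTSC 60 Hz; a delta-time game does not.

def distance_after(fps: float, seconds: float, frame_tied: bool) -> float:
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(int(fps * seconds)):
        if frame_tied:
            pos += 2.0          # "2 units per frame" -- speed depends on refresh rate
        else:
            pos += 120.0 * dt   # "120 units per second" -- refresh-rate independent
    return pos

for fps in (60, 50):
    print(f"{fps} Hz  frame-tied: {distance_after(fps, 1.0, True):6.1f}"
          f"   delta-time: {distance_after(fps, 1.0, False):6.1f}")
```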
 

PeteBull

Member
50 vs 60 was different because games slowed down. It wasn't done properly on PS1; time calculation was tied to fps.
True, but my point still stands: if u can have a game running 2x as smoothly while looking a few % worse, u will always go for that resolution in motion. U would really have to have a massive visual downgrade to consider the 30fps mode, provided ofc its a stable 30fps vs a stable 60fps, not a 60fps target with nasty dips like, for example, GoW 2016 had on PS4 Pro.

Even back in the PlayStation 1 era the difference between 50/60 and 25/30fps games wasn't that crazy; for example Soul Blade/Soul Edge was a 3D fighter running at only 25/30fps and it didn't look better than Tekken 3 (Tekken 1/2 looked ofc much worse, but thats coz devs were just starting out on their 3D polygonal adventures ;p).


Edit: To be fair, Tekken 2 on PSX does look worse.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Devs just need to be better at making 30fps games.
Need for Speed Hot Pursuit '10 is 30fps.......I had no idea it was 30fps till I got the PC version.

Clearly Criterion were doing something under the hood that made the game feel and look better than other games at 30fps
 

Magik85

Member
60fps all day.
40fps is acceptable, but 30fps gives me an actual headache.
Years ago i could play at 30fps, but somehow nowadays i'd rather skip the game.
I don't think it's just about me getting used to it. Back then most of us had smaller TVs, and screen size makes quite a difference in my perception of smoothness. Switching to OLED also didn't help.
 

Bojji

Member
READ THE FUCKING POST BEFORE POSTING NON-ARGUMENTS PLEASE.

This thread is inspired by some of the comments I read in this thread:
where people talk about playing old games that ran at super low framerates, and having no issues with that...
And I too am one of them. I even just recently played through Disaster Report on XBSX2, a PS2 game that has a 20fps lock, and even drops below that 20fps lock in some scenes...
I had no issues playing it... but why do I hate 30fps in modern games, when I play 20fps retro games just fine?

The quick explanation would be:
1: Framerate is not the issue.
2: Input lag, and visual noise is.

Modern games feel worse at low framerates than older games.
Example: Star Wars Jedi Survivor at 30fps feels way worse than Ocarina of Time at 20fps.
and the reasons for this are mainly how modern engines work and how modern games are rendered.


But I will elaborate on these a bit, otherwise what kind of OP would this be? :D


First of all, input lag:
Many modern engines have an insufferable amount of input lag, and the games built on them aren't designed with that lag in mind either.
Unreal Engine is especially bad with this, but it is by far not the only one... God of War's current engine has the same issue, as do CryEngine and many others.
They don't inherently have these issues, but the way almost all developers use them is the issue: they rely on the default ways these engines read inputs, render images and handle Vsync.

Ocarina of Time on N64 had less input lag at 20fps, than almost any modern game has at 30fps... and for some even the 60fps modes have higher input lag than 20fps Ocarina of Time... this is not an exaggeration, it's absolutely true.

And this is not down to modern TVs either. In fact, if you have a modern Samsung TV from the last 2 production years, for example, and you play at 120fps on it, your screen's input lag is lower than the latency of a CRT. Because yes, CRTs also have input lag: it comes from how they draw an image from top to bottom 60 times each second, which means one image takes 16.6ms to be drawn by a CRT. Input lag is usually measured at the center of the image, meaning a CRT, by that standard, has at least 8.3ms of input lag. A modern 120Hz Samsung TV meanwhile has ~5ms of input lag, and about 9ms at 60Hz.
The CRT input lag advantage has basically been matched now; we are so close to it even on TV screens that it's not a factor anymore, let alone on high-end PC monitors.
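The scanout arithmetic from that paragraph, spelled out (screen-centre convention, ignoring whatever processing a given TV adds on top):

```python
# Scanout latency to the middle of the screen, which is how display lag is
# usually quoted. A 60 Hz CRT paints the picture top to bottom over ~16.7 ms,
# so the centre of the image appears ~8.3 ms after the frame starts.

def centre_scanout_ms(refresh_hz: float) -> float:
    return (1000 / refresh_hz) / 2

print(f"60 Hz CRT, centre of screen:    ~{centre_scanout_ms(60):.1f} ms")
print(f"120 Hz panel, centre of screen: ~{centre_scanout_ms(120):.1f} ms "
      "(plus whatever processing the set adds)")
```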

And this increase in latency isn't only obvious when comparing super old games to super new games.
New games still have a shitload of variation, with games like Call of Duty Modern Warfare 2/Warzone 2.0 having input lag so low, that they compete with old SNES games!
We are talking ~40ms of latency at 60fps, and below 30ms at 120fps, which is lower than most 8 and 16 bit games ever dreamt of reaching.
We could compare some Xbox 360 games that ran at 30fps to some modern games that run at 60fps, and Halo 3 would win... God of War Ragnarök at 60fps funnily enough has about the same latency as Halo 3 at 30fps, which is around 90ms for both (+/- 5ms)

So even tho there are modern games that absolutely crush it when it comes to latency, like Call of Duty, and there are of course old games like Killzone 2 that were infamous due to their high latency, it's sadly a pattern that latency is going up.
We are now in a spot where a bulk of modern titles have the same, or even more, input lag than Killzone 2 had, a game that was panned at the time for how awful the aiming felt due to its lag.
Coming back to God of War Ragnarök. That game in its 30fps Graphics Mode, has an input latency of around 170ms. Killzone 2's latency was in the 150ms ballpark!!!

So let that sink in for a moment... during Gen 7, Killzone 2 got massive bad PR for having 150ms of latency... and a modern game like God of War Ragnarök easily exceeds that!
Meanwhile Halo 3 had 90ms of latency at 30fps on an Xbox 360, and Ragnarök has about the same amount of latency in its 60fps Performance Mode.
In God of War's case it seems to be mostly the Vsync that is the issue, since as soon as you use the unlocked VRR mode that deactivates Vsync, the latency shrinks down to 90ms, on par with Halo 3 and the 60fps Vsync mode of the game.

Why does Vsync introduce input lag? Because it pre-renders (buffers) frames. And some games take this to another level, usually to smooth out framerates because they are extremely demanding on the GPU: they will hold multiple frames in the back buffer before displaying a new one, which gives them a fallback frame under most circumstances, and that keeps the perceived framerate consistent at the cost of input lag.
Sea of Thieves, in fact, is the only game I know of that actually lets the user choose how many frames it should pre-render.
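A crude model of what that frame queue does to latency (the queue depths are placeholders, not what any specific engine uses):

```python
# Crude model of a pre-render queue under vsync: every buffered frame sits
# in the queue for one refresh interval before it reaches the screen, so
# input sampled for a frame shows up roughly queue_depth * frame_time later,
# on top of simulation and scanout time.

def added_queue_latency_ms(fps: float, queued_frames: int) -> float:
    return queued_frames * (1000 / fps)

for fps in (60, 30):
    for depth in (1, 2, 3):
        print(f"{fps:2d} fps, {depth} pre-rendered frame(s): "
              f"+{added_queue_latency_ms(fps, depth):5.1f} ms")
```

At 30fps the same queue depth costs twice as many milliseconds, which is part of why heavily buffered 30fps modes feel so much worse than the frame rate alone suggests.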


Secondly, how games build their final image:
Modern games have tons of high frequency details, and modern games are expected to be played on high resolution displays.
These 2 factors are the reason many games now use aggressive TAA, and more and more games use one of the many forms of Upsampling to higher resolutions from a lower base resolution.

These 2 things both lead to the same issue, a muddy and hard to parse image in motion.

Then a third factor comes in, MOTION BLUR *scary thunder noises*
And Motion Blur adds even more visual noise and muddies the final image even more.

So a modern game will often look pristine and super clean when you hold your camera completely still, but as soon as your character, and especially the camera, moves at all... MUD AND SMEARING.
And funnily enough we have an old game as an example here again, one that at the time got a lot of flak for having the same issues that are nowadays so common that no one even really talks about them in reviews or anything anymore...
And that game is Halo Reach.

Halo Reach was an early example of a game that used a form of Temporal Anti Aliasing... TAA.
And TAA at the time Halo Reach was made, was still firmly in its infancy, exhibiting a lot of issues and visual noise. So if you stood completely still in Halo Reach, it looked GLORIOUSLY smooth and clean... move the camera and it was a jittery mess.

These days TAA got a lot better, but it still has issues with ghosting and clear fizzle and disocclusion artifacts the moment something on screen moves.
But of course, TAA isn't the shiny new thing anymore... we now have FSR2, TSR and many other proprietary methods by different developers...
And these upsampling methods basically bring back almost all the issues Halo Reach had, and then some!

When you play Ocarina of Time at 20fps on an N64 connected to a CRT, motion is clean! You can see motion clearly, there's no artifacts from anything moving too fast, there is no blur to reduce legibility of the action happening around you, and there's no motion blur to try to make it seem smoother than it is.

"but why is this a 30fps problem mister Y" is what you are now asking I bet!
WELL, TAA, FSR2, TSR, CBR, ADD UPSAMPLING METHOD HERE... and Motion Blur, all of these rely on image data from previous frames!
THE LESS IMAGE DATA, THE MUDDIER THE IMAGE IS IN MOTION!
So, at 30fps these modern upsampling and anti aliasing methods have less temporal data available to them to create their final image!
A game running at 1440p FSR2 Quality Mode will look sharper and have less artifacts when running at 60fps than it would have at the same settings but locked to 30fps.
So the lower the framerate, the more "smear" the more "mud" and the more "fizzle" there is.
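A stripped-down version of that "less temporal data" point: TAA/FSR-style reconstruction keeps a history buffer that gets blended with each new frame, so at 30fps it simply receives half as many samples per second to converge. Toy blend below with a made-up blend weight, not any vendor's actual reconstruction:

```python
# Toy temporal accumulation: history = lerp(history, current_frame, alpha).
# After N frames the history has "seen" 1 - (1 - alpha)^N of any new detail
# revealed by camera motion. Half the framerate means half the frames in the
# same wall-clock time, so the image is less converged (more smear/fizzle).

alpha = 0.1  # placeholder blend weight per frame

def converged_fraction(fps: int, seconds: float) -> float:
    frames = int(fps * seconds)
    return 1 - (1 - alpha) ** frames

for fps in (60, 30):
    print(f"{fps} fps, 0.1 s after a disocclusion: "
          f"~{converged_fraction(fps, 0.1) * 100:.0f}% converged")
```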

So in the end, all of this adds up.
The motion blur, the image reconstruction, the anti-aliasing: all of it gets worse the lower the framerate is.

This is why Jedi Survivor in motion looks like this (I purposefully took a shot from a spot where the game locks to 60fps to give it the best chances):
[Screenshot: Star Wars Jedi Survivor in motion]


and F-Zero GX like this (I'm turning sharp left here to give it the worst chances with fast camera and vehicle movement):
[Screenshot: F-Zero GX in motion]



Normalized for same image size so you don't have to zoom as much on mobile (not zoomed in in any way, just cropped)
[Cropped Jedi Survivor screenshot]
[Cropped F-Zero GX screenshot]



And the lower the framerate, the worse all the weird grainy, blurry, fizzle artifacts you see in Jedi Survivor will get. Meanwhile, if you ran F-Zero GX at 10fps, it would look literally the exact same in terms of image quality and artifacts (because it has no artifacts).
And we haven't even touched motion blur, which is off in my screenshot here, along with all the other image filters.

If a game like Jedi Survivor runs at a lower framerate, the amount of data the engine has to draw the image from one frame to the next is reduced, the image gets less readable and muddier, and you will see blur in motion even though motion blur is off.


Smaller issues that aren't technical but due to Art Design of games:
Older games have different considerations when it comes to art (textures, models, scale) than modern games.
Older consoles couldn't handle as much detail as newer consoles can. This means textures had less "visual noise", meaning there is less for your eyes to get hung up on, less for your brain to process.
In motion, when running through a level or turning the camera, this means the things that are important, or could be important, as well as your general environment, are easier to digest.

You will have an easier time knowing where enemies are, or where a button or rope is, when it's simpler in design, bigger in scale and stands out more. And all of these are features older graphics had!
On a wall with a simple flat texture, no shadows or reflections, a red button that is, on top of that, unrealistically big in scale is easier to see than a realistically scaled red button on a wall with detailed textures, shadows of adjacent objects and other lighting applied.

So even if older games had the graphical issues I explained above, the simpler graphics would still make it way easier on your eyes, because there is not as much small detail that can get lost in all the blur, fizzle and breakup.

In short: Less Detail + Scale = Easier to read environments even at low framerates.



SO IN CONCLUSION:
Playing a game like Ocarina of Time will need getting used to at first, but once your eyes are adjusted to the 20fps output, the image will look pretty decent and clean.
Playing Jedi Survivor at 60fps already makes the whole image look muddy and blurry due to the artifacting of the upsampling. At 30fps this only gets worse; add motion blur to negate the stutter and you'll barely see anything in motion.

Playing Ocarina of Time at 20fps on real hardware will feel... not amazingly responsive, but more responsive than many modern games like... Jedi Survivor... or Redfall... or God of War Ragnarök feel at 30fps, hell some of them even at 60fps!

So you can not use an old game and point the finger at it saying "LOOK! back then we had no issues with this! this also ran at a super low framerate!", while not taking into account how modern game engines work, how they react to player input, and how they construct their final output image.

Old games had simple graphics; they were easier for your eyes to parse because important stuff was exaggerated in scale and shape, and they had no motion blur, no reconstruction artifacts, no TAA or screen-space effects that lag behind and take time to accumulate data. They had simpler engines with less latency.
And that is why they look cleaner and play better at low framerates than new games.

And yes, there are modern games and old games that are outliers from this. Some modern ones, like From Software games, have pretty low latency, and some old games, like Mortal Kombat on Game Boy, have massive amounts of lag.
And some old games had weird fake motion blur which made the image hard to read, while some modern games have clean images with barely anything to muddy them in motion.
The difference is that there are more of the bad examples today, and more of the good examples in the past.

Exactly OP, input lag is the worst offender, it can be horrendous in modern games.

Devs just need to be better at making 30fps games.

They need to just target 60fps from the start, or offer 40fps modes in every 30fps game so they can become playable on modern displays.

Didn't read but um.. Bloodborne is really hard for me to complete bc of the framerate. There I said it

Bloodborne is surprisingly good to play. From has fucked-up frame pacing in their games, but input lag is low for 30fps games.
 
OP, you are correct. Modern games have way higher input lag compared to older games. I remember playing GTA3 on PS3 and I could easily aim despite the game running at 23-25fps, while on PS4 I was forced to play with auto-aim, because free aiming felt extremely floaty. The PS4 version runs at a locked 30fps, but the input lag was way higher compared to the PS3 version.

On PC I always play without VSYNC because I don't need it on my VRR monitor. Without VSYNC even something like 40fps already feels amazing to me.

But display technology is also important. On my VT30 plasma even 30fps games have acceptable motion clarity, so I can play PS4 games at 30fps and still have an acceptable experience. What's also interesting: my VT30 plasma at 60fps has BETTER motion clarity than my LCD at 170Hz.
 

ReBurn

Gold Member
Thanks for the detailed post, OP. As an older gamer I grew up playing 8- and 16-bit consoles and naturally progressed to the 32-bit systems. Blah blah. Anyway, I’ve never really understood the whole 60fps obsession. Like the post says, older games were never at that speed, so I guess I just never cared. I remember playing Perfect Dark on N64 and that thing really struggled to maintain probably 15 fps at times, and I didn’t care. Does make me laugh when I see younger gamers complaining if a game isn’t 60fps 🤷
We didn't talk about framerates as much during the 8/16-bit sprite-and-tile console days because those machines output images differently than when things moved to 3D polygons in the 32/64-bit era. Because of how CRT TVs worked back then, consoles always output a signal at the refresh rate the TV they were connected to supported, because an image wouldn't display otherwise. So we technically saw the 60 fps equivalent for most games on those old machines (in NTSC anyway), we just didn't know it. Games would slow down if they couldn't recalculate sprite movement in sync with the TV, but they didn't drop frames, because most of those machines didn't have enough memory to buffer frames in the first place.

The shift to 3D polygons with the 32/64-bit consoles was awkward. In some ways those chuggy, wobbly games felt like a step backward, but the new gameplay mechanics they introduced made the growing pains worth it. I think that with consoles implementing true frame buffers, and TV tech standardizing on a progressive-scan model where visuals are literally a slideshow now, the FPS conversation became more poignant than in the old days. That's why folks focus on it so much now. Falling out of sync with the display is much less forgiving than it used to be.
 

Hunnybun

Member
I don't, because this is the usual thread that ignores the simple fact that frame rate perception is a very personal thing and replaces that fact with the self-centered approach that what is unacceptable for you must be unacceptable for everyone.

We've had a gazillion of these threads, here, and elsewhere. You're not making any new argument, and like everyone else who made a thread like this before, you fail to understand that this is a subjective matter that depends entirely on how each individual's brain is wired.

I'm sure people definitely do differ in their ability to perceive different frame rates. I can't really tell much above 80 or so, for example.

The part I struggle with is the idea that people can't see much of a difference between 30 and 60. That seems.... bizarre.
 
30 fps was only acceptable in the PS1/SAT/N64 generation. That's because textured 3D graphics were new and being able to run them at 30fps at all was a good start. All generations prior to that had 60fps as the standard. The generation after that had more 60fps games, but later on everything went down the drain. 30fps is only "acceptable" now because the majority of average joes play slow-paced cinematic games that don't even need to be responsive at all.

The motion blur is a result of the crappy "sample and hold" LCD technology. CRTs don't have that, so they don't suffer from this; that's why motion on CRTs is always crystal clear. You need more than 240Hz to achieve similar motion clarity on a high-refresh-rate monitor, and the game NEEDS to run at 240+ fps too; a 60fps game on a 240Hz monitor will still be a blurry mess (unless you enable other features like black frame insertion, which can have its own artifacts).

In conclusion, standards of gaming are lower. People seem to care more about how good something looks as a static image instead of in motion. Sad.
On a sample-and-hold display you need ~1000fps to get motion clarity similar to a CRT, and "only" 240fps if you want to match the best plasmas like my VT30.



 

rofif

Can’t Git Gud
There is no definitive answer to this. Some like motion blur, some don't.
It's not about liking blur or not.
There is only one fact: motion blur is essential and needed for sub-240-300Hz content.
Your eyes don't capture photos. A monitor doesn't mimic motion; it just displays photos.
Photos do not create the momentum needed for real-life motion blur to occur. That's why you need either hundreds of frames or motion blur.

If you capture a picture of a waterfall with a high shutter speed (low motion blur), it will look nothing like real life: it will be frozen in time, with droplets hanging in the air, or helicopter blades standing still.
The whole idea is to capture in each frame as much of the motion as happened (or could have happened) in the time that passed between frames:

So for 60fps, each frame should contain 16ms of motion, and for 30fps, 33ms of motion. This helps cheat the brain with additional motion data that you don't get from simply displaying 30 still images one after another.
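That "X ms of motion per frame" idea as a toy camera-shutter calculation (the object speed is a number I made up purely for illustration):

```python
# Per-frame motion blur as a virtual camera shutter: a full "360 degree"
# shutter stays open for the whole frame interval, so an object moving at
# v pixels/second leaves a streak of v * (shutter / fps) pixels per frame.

def blur_streak_px(speed_px_per_s: float, fps: float, shutter: float = 1.0) -> float:
    # shutter = 1.0 means the virtual shutter is open for the entire frame
    return speed_px_per_s * (shutter / fps)

speed = 1200  # placeholder object speed in pixels per second
for fps in (60, 30):
    print(f"{fps} fps: ~{1000 / fps:.0f} ms of motion per frame, "
          f"a ~{blur_streak_px(speed, fps):.0f} px streak for this object")
```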

Now... if someone doesn't like the ugly motion blur implemented in 75% of games? That's absolutely fine. Only some games have the nice, high-quality per-object motion blur.
 
I love my ST50 plasma. I also had a VT50 prior to my OLED. Sad I sold it.
I also had GT60 (it's now broken due to a lightning strike). VT30 has a little bit worse picture quality, but GT60 has twice as much input lag, so IMO VT30 is the best plasma for gaming.

You can also get amazing motion quality on OLED, but you have to use BFI. Unfortunately, BFI halves the brightness, so you need extremely bright OLEDs to compensate for the loss of brightness.
 

Rea

Member
Some modern games in 60fps mode look like shit due to the low res. 30fps looks so sharp and good even in motion, but so stuttery on my OLED. I've never seen an engine that gives both more frames and higher visuals; most of the time performance mode looks worse than Quality because the console GPUs are not fast enough.
 

CGNoire

Member
Need for Speed Hot Pursuit '10 is 30fps.......I had no idea it was 30fps till I got the PC version.

Clearly Criterion were doing something under the hood that made the game feel and look better than other games at 30fps
The controls of NFS 2010 were calculated at 60fps behind the 30fps visuals.
I think DF covered that at release. It definitely felt smoother than expected.
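If DF's description is right, the trick is roughly this shape: run input and handling on a fixed 60Hz step and only present every second tick. A sketch of the general technique, not Criterion's actual code:

```python
# Sketch of a fixed-step simulation at 60 Hz presenting only every second
# tick at 30 fps: controls are sampled and applied twice per displayed frame,
# so steering reacts on the 60 Hz grid even though the picture updates at 30 Hz.

import time

SIM_HZ = 60
SIM_DT = 1.0 / SIM_HZ

def poll_input():                    # stand-in for reading the controller
    return 0.0

def step_simulation(dt, steer):      # stand-in for the physics/handling update
    pass

def present_frame():                 # stand-in for rendering + flip
    pass

tick = 0
next_tick = time.perf_counter()
while tick < 120:                    # two seconds of the toy loop
    step_simulation(SIM_DT, poll_input())
    tick += 1
    if tick % 2 == 0:                # show a new picture every other sim tick
        present_frame()
    next_tick += SIM_DT
    time.sleep(max(0.0, next_tick - time.perf_counter()))
```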
 

CGNoire

Member
I love my ST50 plasma. I also had a VT50 prior to my OLED. Sad I sold it.

I also had GT60 (it's now broken due to a lightning strike). VT30 has a little bit worse picture quality, but GT60 has twice as much input lag, so IMO VT30 is the best plasma for gaming.
Got one of the final Panasonic Viera models made in 2012, GT-65 or something like that. Wouldn't sell it for the world; the motion resolution more than makes up for the 1080p res.

Plasma 4 Life my Brothers.......or until MicroLED.
 

01011001

Banned
It's acceptable in the same way a discount hamburger is acceptable at McDonald’s.

With TAA blur, 30 fps is a soupy mess in movement. I absolutely can’t stand it.

hence why it WAS acceptable.
that's my whole point. before the days of temporal data being used for stuff like TAA and FSR2 it was a completely different thing to be 30fps or even lower. but now it will make any game a blurry mess
 

rodrigolfp

Haptic Gamepads 4 Life
Give up, my dude. After all the 10/10s and the fans of RDR2, this issue about 30fps and/or high input lag being unacceptable is a lost cause.
 

8BiTw0LF

Consoomer
Yeah, it’s input lag. Demon's Souls' 30fps mode has 75ms more input lag than Bloodborne. Seriously, install both and test it. BB feels great.
It’s the modes. 30fps needs to be properly implemented, not vsynced crap.
Uncharted 4 does it well: good input lag and heavy, high-quality motion blur. Forbidden West's 30fps also feels very good but could use better motion blur. OLED is great because it has no lag, but it requires great motion blur; otherwise it looks jerky. Uncharted 4 has that good motion blur.

So you are completely wrong about motion blur. It is essential and required for OLED. It adds smoothness in motion. Each frame should be a capture of the time that passed, not a still.
Don’t mix this up with FSR and visual artifacts.

Remember that 60 to 30 fps is only a ~16ms difference in frame time. No reason for games to have 100ms worse input lag like some do.

Again: motion blur is essential to avoid jerky motion and stutter on OLED. Unless you are running at 240Hz; then you've got enough frame data.

Do you really want to go down this path again? :messenger_tears_of_joy:

Stop using a video recorded in 25fps! Makes absolutely no sense whatsoever!

 

cireza

Member
Modern TV technology adds a ton of motion blur.

This is the biggest issue to fix before anything else. You can plug your GameCube into an HD TV and rotate the camera: it will be an unplayable mess, while it will be crystal clear on a CRT.

Can't believe how much we lost in the transition honestly. What a fucking joke.
 

8BiTw0LF

Consoomer
Do you really want to go down this path again? :messenger_tears_of_joy:

Stop using a video recorded in 25fps! Makes absolutely no sense whatsoever!


Edit: If you want to prove a point (which you can't) then try this: https://frames-per-second.appspot.com/

You'll see that higher velocity makes the moon blurry, like in real life when your eyes can't keep up with the speed of objects. There's no need to add blur, and 30fps is indeed more cinematic (25fps) than 120fps, but games are not movies.
 

Larxia

Member
It was accepted, but that doesn't make it good.
This, really.
We can come up with all types of explanations about how images are rendered differently and whatnot; in the end the biggest factor is just that people have now got used to 60 fps and can't go back.
There used to be a lot of console players back then who made fun of PC gamers wanting 60 fps everywhere. You often used to read the usual "the human eye can't even see past 30 fps" bullshit etc., it used to be so common, but now that people have got used to it, they realize how hard it is to go back.

Getting used to more comfort can kind of be a curse, because it makes it hard to enjoy less comfortable things anymore.

I'm sure a lot of people who were fine with their PS3 games running at 25-30 fps, claiming it ran perfectly fine in 2009, would find it horrible to play now if they tried it again.
 

rofif

Can’t Git Gud
Do you really want to go down this path again? :messenger_tears_of_joy:

Stop using a video recorded in 25fps! Makes absolutely no sense whatsoever!

Doesn't matter if it's 25, 30 or 60fps.
The principle stays the same: the more fps you get, the less motion blur is needed.
 

yamaci17

Member
40 fps for me

also the 30/40 fps input lag problem does not exist on pc thanks to vrr

it is true that even today on pc u can get less input lag at 30 fps than most games running at 60 fps on consoles. i did this test with forza 5 on a series S: their 60 fps mode had more input lag than my 30fps experience on PC (i dont get 30 fps in fh5, just did that as an experiment. driving at 30 fps on pc was snappier and noticeably more accurate than it was on console. vsync input lag is huge)

i also compared gow on ps4 vs pc at 30 fps and... pc 30 fps just wrecks it. super snappy. reflex is not even needed, just a pure 30 fps lock + vrr was all PC needed to gain a huge advantage on the input lag front. im not saying ps4 was unplayable but it was an eye opener

on the other hand, vsync is borderline unusable on PC. it is catastrophic. if you think console vsync is bad, PC vsync at 30 fps is multiple times worse. I'm glad vrr exists
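For anyone wondering what a 30fps lock without vsync even looks like: present the frame as soon as it's done (VRR syncs the refresh to the frame instead of the other way round), then wait out the rest of the 33.3ms budget. A toy sketch of the general idea, not RTSS or any driver's actual limiter:

```python
# Toy frame limiter for a 30 fps cap on a VRR display: finish the frame,
# present it immediately, then sleep out the rest of the 33.3 ms budget so
# nothing queues up behind vsync and input is sampled as late as possible.

import time

TARGET_DT = 1.0 / 30             # 33.3 ms frame budget

def render_and_present_frame():  # stand-in for the real frame work + flip
    time.sleep(0.010)            # pretend the frame takes ~10 ms to produce

next_deadline = time.perf_counter()
for _ in range(60):              # two seconds of the toy loop
    render_and_present_frame()
    next_deadline += TARGET_DT
    time.sleep(max(0.0, next_deadline - time.perf_counter()))
```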
 

BootsLoader

Banned
It may depend on the game and how the devs build it. Most of the time I don't have any problem with 30 FPS, but I'll prefer 60 FPS in games like Devil May Cry, for example.
 

hussar16

Member
It's honestly the TV tech. Back in CRT days 30 fps looked great, clear and smooth, on any TV; nowadays you need to game in game mode and use all sorts of tricks to lower the blur. If we still had CRT TVs or plasmas we would not need 60fps.
 

[Sigma]

Member
On a sample-and-hold display you need ~1000fps to get motion clarity similar to a CRT, and "only" 240fps if you want to match the best plasmas like my VT30.




I love this man's channel (been subscribed to him) because he's so passionate about that motion clarity and it mirrors my complaints about it.
 

rofif

Can’t Git Gud
You can't show a video in 25fps, demonstrating anything above 25fps. Let it sink in.
I am not doing that, only showing the shutter-speed motion blur principle.
Launch a 30fps game like Uncharted 4 and do that test there in 30fps mode; it's the same...
 

Killer8

Gold Member
Need for Speed Hot Pursuit '10 is 30fps.......I had no idea it was 30fps till I got the PC version.

Clearly Criterion were doing something under the hood that made the game feel and look better than other games at 30fps

There was an interview on DF where the developers talked about the latency.

The follow-up latency test DF did confirmed that Criterion managed to get the input lag down to 83ms, equivalent to the amount seen in Wipeout HD running at 60fps.
 

OCASM

Banned
I like motion blur but it's true that image quality and input lag are god awful in modern games.



Nah, OP. Just no. Played Ocarina back in the day and it was not a significantly better technical experience when compared with Jedi Survivor at 30fps. The only difference is that now people whine and complain about perfectly acceptable framerates because they've decided that 60fps is some sort of gold standard where anything below that number is now literally unplayable. I mean, just look at how much text you needed to use to justify an absolutely ridiculous idea.

Maybe this quote from your essay will help put things into perspective. I made some helpful corrections (shown in bold, the excised sections shown with strikethrough).

Playing a game like ~~Ocarina of Time~~ **Jedi Survivor** will need getting used to at first, but once your eyes are adjusted to the ~~20fps~~ **30fps** output, the image will look pretty decent and clean.
Playing ~~Jedi Survivor~~ **literally ANY N64 title (like OOT, for example)** at ~~60fps~~ **whatever wildly erratic fps the poor machine could churn out** already makes the whole image look muddy and blurry due to ~~the artifacting of the upsampling~~ **the low resolution and smeary Vaseline-like garbage output that masqueraded as anti-aliasing**.


Our games look the best that they ever have. Image quality is amazing and performance is about the same as it always was.
You don't care about quality and you're happy with whatever trash developers shove in your face nowadays. Good for you! (y)
 