ethomaz
Banned
Race Drivin' runs at 4fps... I know it's extreme, but there are a lot of games on the SNES that don't run at 60 or 50fps.

I can't think of a single 2D game running at 30fps on the SNES and MD, or the consoles before them. Maybe there are some.
Like I said, he plays Fortnite (not a graphics showcase) on newer stuff (PS5). He would play on PS4 if PS5 didn't exist, no?

My cousin plays these games and he tells everybody in his school classroom how much better Fortnite looks on PS5 than on PS4.
He wants a PS5 just because of that now.
Ask him about the framerate? lol
He is fine with whatever the game runs.
I may be wrong, but the animations can be 4fps while the game renders at 60.

Race Drivin' runs at 4fps... I know it's extreme, but there are a lot of games on the SNES that don't run at 60 or 50fps.
I feel thanks to Zen 2 CPUs we will get a 60fps option in most games this gen on PS5/XSX. On the previous gen, thanks to the really weak Jaguar cores, it was very hard, but luckily AMD could provide console makers with much more CPU power this time around. So yup, 60fps will hopefully be very common; at least that's the trend we are seeing. But like the OP guessed, 60fps won't come at zero price: cuts to res and very probably lowered settings will be commonplace too.

What defines "normal"? Because I'm sure most gamers play at 30fps instead of 60fps on consoles.
That is the normal.
Now on PC I rarely see anybody playing at 30fps... always 60fps or higher... so on PC, 60fps is normal.
It ran originally at 4fps on the Atari version too.

I may be wrong, but the animations can be 4fps while the game renders at 60.
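The render-rate vs. animation-rate distinction above can be sketched in a few lines. This is purely my own illustration (not the actual Race Drivin' code): a loop that renders every frame at 60fps but only advances an animation every 15th frame, giving that animation an effective 4fps.

```python
# Illustrative sketch only: decoupling animation rate from render rate.
RENDER_FPS = 60
ANIM_FPS = 4
FRAMES_PER_ANIM_STEP = RENDER_FPS // ANIM_FPS  # 15 rendered frames per animation step

def simulate(total_frames):
    """Count how many animation steps happen over a number of rendered frames."""
    anim_steps = 0
    for frame in range(total_frames):
        if frame % FRAMES_PER_ANIM_STEP == 0:
            anim_steps += 1  # advance the 4fps animation
        # every frame is still rendered, so camera/input stay at 60fps
    return anim_steps

print(simulate(60))  # one second of 60fps frames -> 4 animation steps
```

So a game can feel 60fps on the stick while some on-screen element visibly updates at a much lower rate.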
I don't know how many hundreds of hours I have in Driveclub and all the Sony AAAs, and not once did I get a headache.

You can have a photorealistic game with 8K textures and incredible geometry detail, but if it's 30 fps, once you push that right stick left or right, the entire image turns into puke. No thanks.
Adding to this reasonable post, and for truth's sake, I would say casual people should get enough exposure time to both 1440p@60 and 4K@30, on a good 4K TV; after all, people have different tastes.

Last gen had 60fps games in specific genres like FPS and fighting games. It was really just the usual suspects. I think people are less likely to notice the difference when those games are limited to specific genres: they'll probably tend to just attribute the smoothness of a COD to it being a shooter rather than the frame rate per se.
Virtually all big cinematic games were 30fps last gen. People didn't complain because they just accepted that was the situation - whether you knew about the weak CPUs or not, it was easy to infer that 60fps just wasn't a viable option.
This generation is different because the decent CPUs actually make it a relatively easy concession just by lowering resolution and settings. It's basically the first time since we moved to 3d visuals that this has been the case.
I would actually love to do a blind test of non-gamers to see what they think looks better. I strongly suspect it's a fallacy that the preference for fidelity at the expense of everything else is some kind of casual position. It looks much more like a graphics nerd's prerogative to me.
Obviously there's a matter of degree: I'm sure a casual player would prefer, say, Uncharted 4 at 30fps over Uncharted 3 at 60fps. But Uncharted 4 at 1440p60 vs 4K30? I bet the vast majority would think 60 looks better.
So I'm not sure why 60fps should be the preserve of "PC Master Race". Who's to say that 4k and RT and ultra settings aren't the real nerdy features that people should have to buy a $3k PC to indulge in?
Doesn't the same apply to locked 30fps modes??
If you 'puke' it's your fault, and only yours.
Subjective.

Fidelity mode is no doubt awesome for screenshots and shares to YouTube. The ability to zoom in and see each individual blade of grass and count each pixel is great for online discourse. But to actually opt to play the game at 30 fps over 60 fps is very questionable. Hyperbole or not, you are getting a worse gaming experience at 30.
Sad and petty. Some people have different tastes than yours and it doesn't make you any better than anyone else.

Sad and petty? I thought it was funny.
And I will be building a PC. The 4060 Ti is the goal once it releases. I'm less inclined right now with everything being 60fps on current-gen systems. If things go back to 30fps, I'll buy a prebuilt immediately if I can't get my hands on a GPU.
Most people barely understand what this is or even notice it, so I really doubt a lot of them have "seen the light" or that they even care about it.

The problem is PS5 and XSX were setting expectations too high by boosting last-gen games to 60 FPS. It's going to be pretty jarring going from a 60 FPS Horizon on PS5 to a 30 FPS Horizon 2. Does Horizon 2 look miles better? Sure, but the play will feel sluggish until you readjust.
There are a ton of console gamers who have 'seen the light' as to why 60 FPS is superior but now will have to settle back into 30 FPS as true next gen games start coming out.
The blur is still there though. Take a screenshot of spinning around at 60fps and show me what it looks like, because how can you see the details if everything is smeared?

In PC games you can turn it off in a lot of cases. On consoles, absolutely not. Very few games offer the option to turn it off. Performance mode typically runs at 60 fps or higher with far less aggressive motion blur.
right, but the gains from The Matrix vs. say Horizon FW (@60) are absolutely not worth losing 30fps for.

I mean, if you want your games to look like The Matrix UE5 demo, then you will have to eventually go to 30fps at some point, but for now 60fps is fine and cannot be compared to the issue last gen with the Jaguar CPU causing a bottleneck that prevented even Xbox One X from achieving 60fps.
The bolded is hyperbole and I never believe these statements. Funny how the same people watch movies at 24fps and have no headache. Or puke.

I don't know how many hundreds of hours I have in Driveclub and all the Sony AAAs, and not once did I get a headache.
If you 'puke' it's your fault, and only yours.
Adding to this reasonable post, and for truth's sake, I would say casual people should get enough exposure time to both 1440p@60 and 4K@30, on a good 4K TV; after all, people have different tastes.
One thing also: 4K with temporal anti-aliasing looks noticeably worse than 4K with MSAA.
Having a 4k display for blurry games kinda takes away the point of it.
Sad and petty. Some people have different tastes than yours and it doesn't make you any better than anyone else.
You crying?
I'm concerned that I hurt your feelings and am offering a tissue.

You crying?
There is this one focused on first-person shooters.

I would actually love to do a blind test of non-gamers to see what they think looks better. I strongly suspect it's a fallacy that the preference for fidelity at the expense of everything else is some kind of casual position. It looks much more like a graphics nerd's prerogative to me.
You can easily test it yourself using a mouse (the most precise "controller"), aiming in shooters locked to 30 then 60+, and seeing the difference in sluggishness/input lag.

There is this one focused on first-person shooters.
He focuses on low framerates... like 2.5, 3, 5, and 7, going up to 60fps.
There are different curves for shooting and movement... in simple terms, movement shows smaller differences after 15fps... it gives scores, so 15fps starts at 0.75 (75%) and 60fps is 1 (100%)... 30fps stays around 85%.
Things change in shooting, where the curve is more spread out... 15fps received a 50% score, while 30fps received 75% and 60fps received 100%.
For him, framerate affects shooting more than movement.
He concluded the unplayable framerates are the very low ones like 3fps or 7fps.
BTW, he asserts the previous streaming test resulted in 3-7fps being acceptable because in a first-person shooter the framerate affects the player's performance (something that doesn't happen when watching a stream).
I would love to have more tests like that.
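The numbers being discussed can be laid out in a rough sketch. The score values are the approximate ones quoted from the study above; the frame-time arithmetic is just the standard 1000ms/fps conversion behind the "sluggishness" people feel.

```python
# Approximate normalized player-performance scores quoted from the study.
movement_scores = {15: 0.75, 30: 0.85, 60: 1.00}
shooting_scores = {15: 0.50, 30: 0.75, 60: 1.00}

def frame_time_ms(fps):
    """How long each frame (and the input sampled for it) stays on screen."""
    return 1000.0 / fps

for fps in (15, 30, 60):
    print(f"{fps}fps -> {frame_time_ms(fps):.1f}ms/frame, "
          f"movement {movement_scores[fps]:.0%}, "
          f"shooting {shooting_scores[fps]:.0%}")
# 30fps -> 33.3ms/frame vs 60fps -> 16.7ms/frame: each frame lingers
# half as long at 60, which is why aiming feels less sluggish.
```

This also shows why the shooting curve matters more: going 30 to 60 gains 25 points for shooting but only about 15 for movement, per the scores described.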
Yeah, I sense the response-time difference when shooting.

You can easily test it yourself using a mouse (the most precise "controller"), aiming in shooters locked to 30 then 60+, and seeing the difference in sluggishness.
The blur is still there though. Take a screenshot of spinning around at 60fps and show me what it looks like because how can you see the details if everything is smeared?
If you needed any more evidence that you are absolutely clueless about this subject, then let this be it.

Still waiting to see someone take a spinning screenshot of Uncharted 4 with performance mode and motion blur on.
Hm, it seems you're still crying. Oh well, have some tissues. Don't let 30fps modes trigger you this much, it's not healthy.

I'm concerned that I hurt your feelings and am offering a tissue.
If I wanted to cry, i'd play my games in 30fps modes.
There's blur on the images.
If you needed any more evidence that you are absolutely clueless about this subject, then let this be it.
Some real lames in here acting like 30fps is some blurry mess. Practically all of the best games of all time were 30fps when they came out. Were they unplayable? Y'all need eye surgery.
What a joke.
30FPS rules. Visual fidelity is infinitely more important to push the medium forward than fucking framerate.
huh, a ton of great games were always 60 FPS. what?

Practically all of the best games of all time were 30fps when they came out. Were they unplayable? Y'all need eye surgery.
Yup. Instead of giving us this option, they should have tried everything they could to get the game to 60 FPS and salvage the resolution and overall presentation in the process. Instead they tossed in a 60 FPS mode with shitty CB and called it a day.

Imagine defending 30 fps. What world are you guys living on?
30 fps is dog shit and always was; it was a necessary evil back in the day to get games to run at any capable level.
Now it's just shit optimisation. The Horizon game, for example, could easily be running at 60 fps if they actually gave 2 shits about it.
Why do you think graphics/visual fidelity have historically been the biggest selling point for next generation consoles over everything else? Huh??? Riddle me that.
And every single one of these games is much better at 60+.

Some real lames in here acting like 30fps is some blurry mess. Practically all of the best games of all time were 30fps when they came out. Were they unplayable? Y'all need eye surgery.
What a joke.
30FPS rules. Visual fidelity is infinitely more important to push the medium forward than fucking framerate.
Visuals are more important than framerate. Always have been, always will be. This is why developers choose to continue to put out games at 30 - it's the best balance between fidelity and smoothness.

And every single one of these games is much better at 60+.
PS: 15fps is even better. 7.5 even more...
Why do you think graphics/visual fidelity have historically been the biggest selling point for next-generation consoles over everything else? Huh??? Riddle me that.

I mean, let's take this list for example:
What was necessary about 30fps back in the day? It wasn't necessary just as it isn't necessary today. They could have easily cut down the graphics to reach 60fps. They don't because you get better graphics at 30fps. Some people are happy with 30fps with better graphics. Judging by the fact that most big games were 30fps I'd say most preferred better graphics at 30fps vs lower graphics at 60fps.Imagine defending 30 fps. what world are you guys living on.
30 fps is dog shit always was, it was a nessesary evil back in the day to get games to run at any capable level.
Now its just shit optimisation. Horizon game for example could easily be running at 60 fps if they actually give 2 shits about it.
that didnt answer my questionI mean, lets take this list for example:
Best Video Games of All Time (www.metacritic.com)
it's only an approx. but many games on there are 60 FPS.
it's completely fine to take graphical fidelity over frame rate, but it's not like the low frame rate will enhance your game.
Breath of the Wild is a fantastic game, but the best place to play it isn't on the Switch.
Why do you think graphics/visual fidelity have historically been the biggest selling point for next generation consoles over everything else?
Play a game with PS1 visuals at 120fps and Flight Simulator 2020 at 0fps and come back here saying visuals are more important than framerate again.

Visuals are more important than framerate. Always have been, always will be. This is why developers choose to continue to put out games at 30 - it's the best balance between fidelity and smoothness.
Horizon FW looks significantly worse at 60. That trend will continue to be more noticeable as the gen moves forward. Thank fuck these developers don't listen to framerate nerds lmao.
OoT was 20fps and is the highest one on your list. lmao

I mean, let's take this list for example:
Best Video Games of All Time (www.metacritic.com)
it's only an approx. but many games on there are 60 FPS.
it's completely fine to take graphical fidelity over frame rate, but it's not like the low frame rate will enhance your game.
Breath of the Wild is a fantastic game, but the best place to play it isn't on the Switch.
then why wasn't the Atari Lynx leading the handheld wars? Why didn't the Xbox win against the PS2? Why was the Wii winning against Xbox and PS3? Why is the Switch killing it right now?

that didn't answer my question
Why is GTAV the best-selling game this gen if visuals are the most important?

that didn't answer my question
So much hyperbole. I'm honestly glad I'm not as sensitive as y'all, since I regularly play older games that run at sub-30fps; imagine bitching about that? Nah, I've got better things to do than screaming at devs and at the screen because a game is not 60fps lol.

Imagine defending 30 fps. What world are you guys living on?
30 fps is dog shit and always was; it was a necessary evil back in the day to get games to run at any capable level.
Now it's just shit optimisation. The Horizon game, for example, could easily be running at 60 fps if they actually gave 2 shits about it.
and Soul Calibur and Tony Hawk were 60 FPS.

OoT was 20fps and is the highest one on your list. lmao
If FS2020 is 0fps I'm not playing it, I'm watching a picture. Also, FS2020 is 30fps on consoles and people use that as their go-to "omg next gen graphics" game.

Play a game with PS1 visuals at 120fps and Flight Simulator 2020 at 0fps and come back here saying visuals are more important than framerate again.
And OoT is better than those two games.

yes, and Soul Calibur and Tony Hawk were 60 FPS.
So you understood why framerate is more important.

If FS2020 is 0fps I'm not playing it, I'm watching a picture. Also, FS2020 is 30fps on consoles and people use that as their go-to "omg next gen graphics" game.
You have a point, but there are certain games that feel different at 60 fps. The movement is more responsive. It feels more fluid. The Demon's Souls remake is the best-case scenario where you can have high-quality visuals at 60 fps. I'd like to see more developers take that route than shoot for 4K over everything else. Even Spider-Man MM with 1080p ray-traced reflections is a compromise I'm willing to accept if the game runs at a smooth 60. There needs to be a balance, or more options to select.

Visuals are more important than framerate. Always have been, always will be. This is why developers choose to continue to put out games at 30 - it's the best balance between fidelity and smoothness.
Horizon FW looks significantly worse at 60. That trend will continue to be more noticeable as the gen moves forward. Thank fuck these developers don't listen to framerate nerds lmao.
And OoT is better than those two games.
Tony Hawk was SUB-30 fps.
Can you post a video instead of a still frame? That way I can tell you what I see or notice.
And every single one of these games is much better at 60+.
PS: 15fps is even better. 7.5 even more...