Minsc
Maybe that's why I don't see a difference, I don't play racing games
What about pinball games? You'd notice there 100% for sure. 30fps in pinball is a death sentence.
> No it's not, motion clarity is just insane.
It is, because higher frames are necessary to compensate for the fact you're playing on technology that inherently introduces blur absent in the source.
Outside VR and competitive gaming, more than 60 is a waste of resources.
> I'm fine with a quality/perf mode because some games don't really need 60fps. It's like the myth that most old 3D games were 60fps. Bunch of liars lol, I grew up with early 3D and 60fps was a pipe dream.
The original PlayStation had quite a few 60fps games, shockingly. On the N64, on the other hand, you were lucky to see 20-25fps from most third parties.
> But what if 60fps seriously compromises game design? E.g. could Tears of the Kingdom exist on the Switch at 60fps?
I played it on Yuzu that way and it made for a FAR FAR FAR superior experience than what was provided on the Switch. 60 fps made it a completely new game.
You would actually need 1,000fps@1,000Hz just to achieve the motion clarity of 60fps on a CRT.
Old CRT TVs had minimal motion blur; LCDs have a lot of it. An LCD would need 1,000fps@1,000Hz to have motion blur as minimal as a CRT TV's.
I bet for a lot of you this is going to be entirely new information, because it was for me when I first read this article back in 2018: https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
A late Panasonic plasma also has a naturally and almost completely blur-free image (and full native motion res), which is why 60fps signals look so insanely good on them.
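The 1,000fps figure above is just arithmetic: on a sample-and-hold display each frame stays lit for the whole frame interval, and an eye tracking a moving object smears that interval into blur. A rough sketch of the rule of thumb (the ~1 ms CRT phosphor persistence is an assumed ballpark for illustration, not a measured spec):

```python
# Sample-and-hold motion blur, per the Blur Busters rule of thumb:
# perceived blur (px) ≈ tracked motion speed (px/s) × persistence (s),
# where persistence is how long each frame stays lit on screen.

def blur_px(speed_px_per_s: float, persistence_s: float) -> float:
    """Approximate blur width of a tracked moving object."""
    return speed_px_per_s * persistence_s

speed = 1000.0  # object panning across the screen at 1000 px/s

# Full sample-and-hold: persistence = one whole frame interval.
print(blur_px(speed, 1 / 60))    # 60fps LCD/OLED: ~16.7 px of smear
print(blur_px(speed, 1 / 1000))  # 1000fps@1000Hz: ~1 px of smear

# A CRT flashes each phosphor only briefly (assume ~1 ms), so even a
# 60fps signal keeps its persistence -- and therefore its blur -- tiny.
print(blur_px(speed, 0.001))     # 60fps on CRT: ~1 px of smear
```

Which is where the 1,000fps@1,000Hz equivalence comes from: matching a short-persistence display by brute force means shrinking the frame interval itself.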
This user is right:
Sacrificing gaming to make up for insane limitations of modern panels is sad. If majority of gamers weren't playing on LCD everyone would agree more than 60fps is a waste.
> To be fair, when those consoles were mainstream, the displays weren't the sample-and-hold ones of today that suffer more the lower the fps is.
Thank you. I hate when people bring this up. 30 fps on a CRT is a vastly superior experience to 30 fps on an LCD. CRTs have been gone for a long while, so it's easy for people to not know or to forget this.
> The original PlayStation had quite a bit of 60fps games, shockingly. N64 on the other hand, you were lucky to see 20-25fps from most 3rd parties.
You have not lived unless you played four-player, slaps-only GoldenEye at 12fps.
PCs suffered as well, which is why 3D accelerator cards were introduced, to alleviate those 15fps.
> Don't OLEDs fix motion clarity tremendously? They have like a 1/1,000,000th of a second response time and similarly fast screen redraws. One of the biggest improvements I notice moving from most LCD devices to OLED is definitely motion clarity. But I think the perfect motion clarity OLEDs can give also causes low framerates to stand out more, at least that's my theory.
OLEDs actually make 30 fps worse than LCDs, because the fast response times make the sample-and-hold that much more obvious. It looks like a stutter fest. After a while your eyes do adapt and it does get better, but LCDs can at times handle 30 fps better, due to their natural blur.
> Don't OLEDs fix motion clarity tremendously?
Naw, and their fast pixel response means they have even more stutter than modern LED TVs.
People like me have begged Nintendo for years to put out a more powerful, more expensive console to play their games at higher res and FPS. The best part is that they wouldn't need hardware specs anywhere near as high as PS or Xbox to achieve that on their games.
This thread actually brought back memories of running Voodoo 2's in SLI and the Matrox Millenium and other GPUs from the past! Good stuff.
> Don't OLEDs fix motion clarity tremendously?
Compared to some LCDs, yes, but nowhere near ideal or on par with what we had in the past.
> I played it on Yuzu that way and it made for a FAR FAR FAR superior experience than what was provided on the Switch. 60 fps made it a completely new game.
Lol, I'm not one of those muppets that can't tell the difference.
I know that is not what you are saying, and no: 30 fps should be allowed for any developer that chooses it. But I can also choose not to play it, which is why I don't buy Nintendo.
> It really isn't.
> I really think it is. I recently bought a Switch Lite and I've been playing a lot of games @30 fps. And I own a good gaming PC.
Go play Halo on the OG Xbox or the old CODs, then come back and say it's fine.
Compared to some LCD, yes, but nowhere near ideal or on par with what we had in the past.
Last I checked, native motion res was some absolutely laughable figure like 300p, on 4K panels (!).
This is absolutely ridiculous compared to the full motion res you get on, for instance, a Panasonic plasma.
> I can accept a minimum of 40 fps on my Steam Deck.
Double standards ftw!
I really think it is. I recently bought a Switch Lite and I've been playing a lot of games @30 fps. And I own a good gaming PC.
30 fps is noticeable at first but that's about it. Calling it 'unplayable' is just pure hyperbole.
Most of you are overreacting.
But this idea that higher frame rates are getting in the way of "ambitious game design" is just false for the vast majority of games. That said, if a dev comes out and says they're making a physics-based game and 60 fps isn't possible due to CPU limitations, then I have no problem with that. But it needs to be something more than just lowering fidelity, imo, to take away that option. Lowering fidelity is merely adjusting settings and resolution to get higher frames, stuff PC gamers do all the time. On consoles, a performance mode is just a preset of those same exact settings.
> 30 fps with Samsung's Game Motion Plus is also tolerable.
Wait, what?
> I grew up with early 3D and 60fps was a pipe dream.
60? 30 was a pipe dream. Those Voodoo 1 cards were the black magic that made it possible. Before that, 15-20 fps was our "smooth" gameplay.
> Yeah, I get a bit lost on how it's rated, but on something like this, rtings scores it 9.9 on basically every motion category. I don't think that's unique either; I imagine most of the 4K 240Hz monitors score incredibly on motion there. "The CAD at 120Hz is outstanding. Pixels transition to their target RGB level almost instantly, so there isn't any blur trail or noticeable inverse ghosting." But I wouldn't be surprised if it's still inferior to plasma.
Not native, interpolated. It creates some bothersome issues when you're too used to native motion.
> I loved my Kuro until I decided I wanted HDR/Dolby Vision haha.
I feel you.
> Wait what? Game mode + Motion Plus? As in, it doesn't add input lag?
It adds very little lag, yes, but nowhere near as much as normal Motion Plus does. I'm still able to pull off timely parries and dodges in Sekiro/Souls/Bayonetta games.
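For what it's worth, there's a hard floor on how low that lag can go: interpolation has to buffer the next real frame before it can synthesize the in-between one, so it always adds at least one source-frame interval on top of processing time. A minimal sketch of that lower bound (function name and default figure are illustrative, not from any TV's spec sheet):

```python
# Frame interpolation needs frame N+1 in hand before it can generate a
# frame between N and N+1, so the added input lag is bounded below by
# one source-frame interval, plus whatever the processing itself costs.

def min_interp_lag_ms(source_fps: float, processing_ms: float = 0.0) -> float:
    """Lower bound on input lag added by motion interpolation."""
    return 1000.0 / source_fps + processing_ms

print(min_interp_lag_ms(30))  # 30fps source: at least ~33.3 ms added
print(min_interp_lag_ms(60))  # 60fps source: at least ~16.7 ms added
```

That floor is why a "game" interpolation mode can be leaner than the normal one (less processing, smaller buffers) but can never be completely lag-free.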
> You have not lived unless you played 4 player slaps only Goldeneye at 12fps
Ah, good times... four-player split-screen couch slaps, throwing knives, and watch bombs. Played for years with my friends. Also Turok 2.
I don't think anyone is disputing that if a game can run at 60fps on console just by lowering fidelity that it should also offer a performance mode. Very little reason not to in those cases.
The problem is people like the OP declaring it should be mandated for ALL games. Because, as has been said, it puts limits on how ambitious games can be. A lot of PC gamers don't seem to realise that the reason they can get 60FPS in just about every game is that most games target consoles, so their high-end rigs have a lot of untapped power not being used.
> Should be the maximum too, anything after 60 is diminishing returns, just put all resources on visuals.
Naw, the motion clarity of 60 fps is still way too blurry.
> But this idea that higher frame rates are getting in the way of "ambitious game design" is just false for the vast majority of games.
Is it not the amazing one-in-a-hundred games we all want to play, and not the vast majority?
You can too. Buy a PC and you can increase the fidelity to whatever level you want, frame rates be damned.
I agree with you that it shouldn't be dictated either way. Devs should choose, but since 3/4 of gamers choose higher performance, I think the market has already made it clear what they want.
All I'm saying is people should have the option to choose for most games.
> Naw, the motion clarity of 60 fps is still way too blurry.
...on our LCD monitors.
Welcome to DoubleClutches TED talk everyone.
Not quite sure who you are talking to though.
It's funny you think I'm a graphics whore lol. I'm really not. I could easily afford a PS5 Pro or a gaming PC, but I'm too tight-fisted for the Pro and too much of an idiot to build a gaming PC!
Naw, the motion clarity of 60 fps is still way too blurry.
Blur Busters UFO Motion Tests (ghosting, 30fps vs 60fps vs 120Hz vs 144Hz vs 240Hz, PWM, motion blur, judder): www.testufo.com
At 240 fps it starts to become normal.
> I have a 165Hz monitor and, at least to me, anything past 60fps doesn't improve fluidity that much, not nearly to the same extent as 30 to 60. Of course on PC you have unlimited power and can push beyond that, but on consoles I just find it a waste of resources.
30-60fps is by far the biggest difference.
60-120fps is noticeable.
60fps should absolutely be standard going forward.
This. 60fps is enough for me. I don't feel sorry for people who game at 120fps and above.
You are basically setting the standard too high; it's not worth it.
> After playing the latest and greatest, 60fps should be a minimum, 120 the standard, and 240Hz the best. Seriously, if you say I'm crazy, you have not played on a 240Hz OLED screen; it's just glorious!!!! I cannot even imagine how smooth a 360Hz monitor and higher would look in person.
And what percentage of Steam users have 120Hz monitors?
> It could be 100% of players.
Even if the standard were 20fps we would still get 720p games. Most developers optimize their games only to the minimum needed, to ship faster. It's simply that most developers who began developing their games 4 years ago didn't expect 60fps to be a standard today.
What I said is still fact. Framerate requires everything else to be dumbed down. And games already run at sub-720p in 2024.
Garbage
> 60? 30 was a pipe dream. Those Voodoo 1 cards were the black magic that made it possible.
The first paycheck I got went on a Voodoo. Terrific card.
> After playing the latest and greatest, 60fps should be a minimum, 120 the standard, and 240Hz the best.
Yeah, this gen should've taken the hit and made all games play smoothly at 60 fps, and then upgraded the graphics from there in future gens...
Yeah. Let's sacrifice every single aspect of our game. Let's make everything objectively worse. Let's cut NPC counts in half. Let's lower shadow quality, lower texture quality, eliminate ray tracing completely. Sounds great! While we're at it, let's drop the fucking resolution down to 540p in 2024; who cares if it looks like Vaseline smeared all over the TV screen. 300fps baby!
Lol
Framerate Warriors man, you've completely ruined this generation with your absolute nonsense.
JUST BUY A PC