Admiral Woofington
The Hobbit at 48fps (or whatever its frame rate was) hurt my eyes. It sucked. The battle scenes looked too fast. So personally, 60fps does not give that cinematic feel for me.
It's like old b&w blurry pictures vs 40MP digital shots.
Sometimes they have a charm of their own and you don't need perfect fidelity.
A lot of movies are still shot on film instead of digitally because film has a certain "look" to it that is lost when everything is shot digitally.
120 fps or bust you peasants.
When playing a 60fps game (no matter how good the graphics are) it reminds me of playing an arcade game. It's a constant reminder that I'm playing a video game and breaks the immersion. Maybe that's what you're feeling.
I also hate that motion flow crap they put in TVs these days. Makes every show look like it was filmed on a consumer camcorder.
There are 4 kinds of people:
1. Those who won't play anything less than 60fps (PC snobs)
2. Those who love 30fps for its consistency and cinematic look
3. Those who love an unlocked frame rate, even if it means it's only 60fps when looking up at the sky.
4. Those who prefer a stable 60fps, but would rather take a capped 30fps over an unstable 30-60 frame rate.
When playing a 60fps game (no matter how good the graphics are) it reminds me of playing an arcade game. It's a constant reminder that I'm playing a video game and breaks the immersion. Maybe that's what you're feeling.
As opposed to thinking you're watching a movie? What's more immersive about that?
I think people who really care about framerate should not settle for less than 120fps.
Well, there are a lot of people, myself included, who so far *have* to content themselves with 60fps screens, because a 120Hz panel is a luxury they can't necessarily afford.
There are 4 kinds of people:
1. Those who won't play anything less than 60fps (PC snobs)
2. Those who love 30fps for its consistency and cinematic look
3. Those who love an unlocked frame rate, even if it means it's only 60fps when looking up at the sky.
4. Those who prefer a stable 60fps, but would rather take a capped 30fps over an unstable 30-60 frame rate.
The 60fps Uncharted 3 video they released felt very strange; I don't know why.
60fps to me has always been a positive thing as far as I can remember.
The Hobbit at 48fps (or whatever its frame rate was) hurt my eyes. It sucked. The battle scenes looked too fast. So personally, 60fps does not give that cinematic feel for me.
Grow up.
With the much more accurate control from a mouse it is very easy to feel the higher input latency of lower framerates!
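For a rough sense of the numbers, here's a minimal sketch. It assumes input is sampled once per frame and shown a couple of frames later, which is a deliberately simplified pipeline model, not any particular engine or display chain:

```python
# Frame-time / input-lag arithmetic. A deliberately simplified pipeline model:
# input is sampled once per frame and the result appears `pipeline_frames`
# frames later. Real engines, drivers and displays differ, so treat these as
# ballpark figures only.

def frame_time_ms(fps: float) -> float:
    """Time budget for a single frame, in milliseconds."""
    return 1000.0 / fps

def worst_case_input_lag_ms(fps: float, pipeline_frames: int = 2) -> float:
    """Input arriving just after a sample waits ~1 frame, then needs
    `pipeline_frames` more frames to reach the screen."""
    return (1 + pipeline_frames) * frame_time_ms(fps)

for fps in (30, 60, 120):
    print(f"{fps:>3} fps: {frame_time_ms(fps):5.1f} ms per frame, "
          f"~{worst_case_input_lag_ms(fps):6.1f} ms worst-case input lag")
```

The absolute numbers depend on the setup, but the gap between 30fps and 60fps works out to tens of milliseconds either way, which is easy to feel with precise mouse movement.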
When playing a 60fps game (no matter how good the graphics are) it reminds me of playing an arcade game.
Exactly! We love video games!
It's a constant reminder that I'm playing a video game...
C-C-C-C-Combo Breaker! And here was I thinking you were actually complimenting arcade-quality framerates and gameplay. Different strokes for different folks, I guess.
...and breaks the immersion.
I also hate that motion flow crap they put in TVs these days. Makes every show look like it was filmed on a consumer camcorder.
There are 4 kinds of people:
1. Those who won't play anything less than 60fps (PC snobs)
2. Those who love 30fps for its consistency and cinematic look
3. Those who love an unlocked frame rate, even if it means it's only 60fps when looking up at the sky.
4. Those who prefer a stable 60fps, but would rather take a capped 30fps over an unstable 30-60 frame rate.
It's Stockholm syndrome: people have been stuck with 30fps and below for so long that they're rationalizing reasons to keep it around.
Go back 4 years and try to find threads on GAF from people claiming lower framerates are better, or that 60FPS is this bad thing that's not 'cinematic'. You won't find anything, and if you do, it's rare. It was born during this generation transition, when certain games fell short of what gamers wanted out of their new consoles (1080p 60fps).
Well, it wasn't born this generation. The first I've heard of it was with Shadow of the Colossus HD; some people agreed with the developer when they said 60fps would look messed up.
I will agree that controlling a 30fps game with a mouse is horrendous. However, I prefer using controllers and 30fps on a controller feels fine.
Also, I think the reason 60fps is hated by some is that 60fps can be associated with arcade games, and I also think 60fps shows TOO MUCH, so the game loses some of its atmosphere because of it.
When watching a frantic fight scene in a movie at 60fps, it just looks like behind-the-scenes footage of a couple of actors, kind of like fake wrestling, but when the same scene is played back at 24fps with motion blur, it can often look disorienting (on purpose) and add to the look and feel of the fight. At a slower frame rate it's harder to notice mistakes or the fact that it's fake fighting.
I'm kind of with those people who think 60fps looks too fast and animations look unnatural.
We'll see soon if the 60fps version of the Last of Us improves the game or if people find it odd looking.
Oh, one more thing about 60fps... whenever I see a game running at 60fps, I think to myself that the developers have not pushed the graphics far enough. If the engine has that much headroom, I just wonder why the developers didn't use it to provide better physics, lighting, particle effects, animations, etc.
Those are my opinions. I know some people may feel like I'm an idiot or that I'm just conditioned to 30fps or whatnot, but I don't care. I know what I like and I know what looks good to me. Just like a game's color palette can add to the mood or atmosphere, so can the frame rate.
No, that's not a fact, just your opinion.
confirmed.
Stop trying to convince everyone that 30fps is better or that anything above it breaks immersion, because you are massively wrong. You bizarrely prefer 30fps, we get it, but claiming it is better in any shape or form is wrong. Not opinion, but fact.
This is pointless. The 60fps crowd will never simply accept that someone might genuinely like the look of motion blurred 30fps better in some cases.
" 30fps IS more filmic than 60fps " was NEVER proven wrong. It can't be proven wrong.
While you can't obtain a motion blur the same quality of that captured on film, 30fps+mb will always look a lot closer to what [our brain is conditioned to think] a movie looks like than 60fps.
There's no way around this.
Whether this is because our brain is conditioned to automatically classify 24fps as film and 60fps as cheap camcorder footage, that's another story, but the point still stands.
I am a videomaker; I do video and motion graphics for a living.
If I shoot 48fps footage with my camera, most people have the impression it doesn't "look like a movie" for some reason (most won't be able to tell why).
Simply halving the framerate will do the trick: people will think that the same footage now "looks like film".
And when I do motion graphics in After Effects or some 3D rendering software, I can export to whichever framerate I like. And I sometimes choose 25fps with motion blur, because in a lot of cases I just prefer that look. Plain and simple.
It is, in other words, my artistic choice, and as someone who does motion graphics and often goes for 25fps, I completely understand why 30fps might honestly be a purely aesthetic choice.
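To make the "halving the framerate" trick concrete, here's a minimal sketch. It assumes frames are already decoded into plain arrays, and the helper name halve_framerate is made up for illustration; real tools like After Effects or ffmpeg do this far better:

```python
# Two ideas from the post above, in toy form. Assumes frames are numpy arrays
# of equal shape; decoding/encoding is out of scope. This is only a sketch of
# the principle, not how any particular editing tool implements it.
#   1) Halving the frame rate = keeping every other frame.
#   2) A crude stand-in for shutter motion blur = blending each dropped frame
#      into the frame that is kept.
from typing import List

import numpy as np


def halve_framerate(frames: List[np.ndarray], blend: bool = False) -> List[np.ndarray]:
    """Turn e.g. 50 fps footage into 25 fps footage."""
    out = []
    for i in range(0, len(frames) - 1, 2):
        if blend:
            # Average the kept frame with the frame that would be dropped.
            out.append((frames[i] + frames[i + 1]) / 2.0)
        else:
            out.append(frames[i])
    return out


# Eight dummy grayscale "frames", just to show the frame counts.
clip_50fps = [np.full((4, 4), float(i)) for i in range(8)]
print(len(halve_framerate(clip_50fps)))                  # 4 -> half the rate
print(halve_framerate(clip_50fps, blend=True)[0][0, 0])  # 0.5 -> blended value
```

Frame blending is only a rough approximation of real shutter blur, but it's the same basic reason 25fps-with-motion-blur reads as "film" to most viewers.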
Of course there are benefits coming from that and from the 800p resolution, but that's beside the point.
There are other factors working in favour of 30fps.
60fps will in fact look closer to real life.
Because of that and because of the increased fidelity, your eyes are more likely to go searching for inaccuracies and inconsistencies.
That's why, watching The Hobbit, I'd often snap out of the illusion and had to try really hard to convince myself I wasn't looking at a bunch of weird guys wearing funny costumes and make-up, moving back and forth on a movie set.
60fps improves depth perception; everything appears sharper. The shape of things and objects onscreen becomes easier to read, and 3D models acquire a much more 'solid' presence. That is good in many ways, but it also makes it much easier to pick out inaccuracies, polygonal edges and flat surfaces.
To some, these might all sound like positives in a game; I'm not arguing that.
Just accept that not everyone looks for the same things.
Sometimes 30fps helps the illusion; it's a screen that helps hide the fine details that immediately tell you what you're looking at is fake, and it sometimes works better for games that go for a movie look or photorealism.
Being unable to see what's going on makes for more immersion? Okay.
Well, the idea is that games are illusions and that breaking that illusion can affect the immersion.
M°°nblade;114381244 said: Well, the idea is that games are illusions and that breaking that illusion can affect the immersion.
When you play a game like Dark Souls and crank up the brightness for gameplay advantages, you see more, but you break the immersion.
I'm playing Bioshock Infinite (PS3) at the moment and everything seems to be deliberately covered in vaseline light effects, just to create immersion.
Just like a higher framerate, a higher resolution is something that's considered to be absolutely superior as well, yet it can make 2D trees in vanilla Oblivion and pixelated bitmap skyboxes in Halo PC visible.
Of course, there will be instances where a higher framerate or resolution does increase the immersion. So it's an argument that goes both ways.
I'm not quite sure you know what immersion means. You can immerse yourself into anything.
So then why do you say it applies to movies but not games?
Games are not illusions, why the hell are people convincing themselves this?
Games aren't real, so of course they are illusions.
M°°nblade;114383503 said: Games aren't real, so of course they are illusions.
Electrons aren't real?
M°°nblade;114383503 said: So then why do you say it applies to movies but not games?
Electrons aren't real?
As real as movie sets and blue screens!
Action in low fps film has always been an awful, blurry mess. More frames per second can only help correct that.
The cinematic argument loses all merit when games do not simulate your perspective as if it were a camera. Games lack every aspect that makes movies at 24/30fps feel more cinematic: first and foremost proper motion blur, but also things like lighting and lenses. Games do not simulate a camera, and to single out fps as the one defining thing that gives movies a cinematic look is some proper denial. If you remove half the frames of a soap opera, it is still going to look like a soap opera because of how it's produced.
Because games have no concept of exposure time!
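To put numbers on the exposure-time point: with the conventional 180° shutter, a film camera exposes each frame for half the frame interval, so anything that moves during that window gets smeared across the frame; a game frame without post-process motion blur is closer to an instantaneous sample. A rough sketch of that arithmetic (the 180° figure is just the common default, not a claim about any specific film or engine):

```python
# Shutter-angle arithmetic: how long each frame actually "sees" the scene.
#   exposure_time = (shutter_angle / 360) * (1 / fps)
# 180 degrees is the conventional film default. A game frame with no
# post-process motion blur behaves like a near-instantaneous sample, which is
# approximated here with a tiny shutter angle.

def exposure_time_ms(fps: float, shutter_angle_deg: float = 180.0) -> float:
    return (shutter_angle_deg / 360.0) * 1000.0 / fps

for label, fps, angle in [
    ("film, 24 fps, 180 deg shutter", 24, 180.0),
    ("HFR film, 48 fps, 180 deg shutter", 48, 180.0),
    ("game, 30 fps, ~no exposure", 30, 1.0),
    ("game, 60 fps, ~no exposure", 60, 1.0),
]:
    print(f"{label:34s} -> {exposure_time_ms(fps, angle):6.2f} ms of blur per frame")
```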
games are not illusions, why the hell are people convincing themselves this?
The question is what framerate would Metroid Prime and F-Zero be running at if you bumped the resolution up to 1080p native?
Yes, technology more than a decade old isn't up to modern standards.
The GameCube natively displayed games at 480p standard definition at a 4:3 ratio.
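Back-of-the-envelope, purely illustrative arithmetic on that question, assuming rendering cost scaled linearly with pixel count, which real hardware doesn't strictly do:

```python
# Pixel-count scaling from the GameCube's 640x480 output to 1920x1080.
# Purely illustrative: it assumes rendering cost scales linearly with pixels,
# which ignores memory bandwidth, vertex work, CPU limits, and so on.

def pixel_ratio(w1: int, h1: int, w2: int, h2: int) -> float:
    return (w2 * h2) / (w1 * h1)

ratio = pixel_ratio(640, 480, 1920, 1080)
print(f"1080p has {ratio:.2f}x the pixels of 640x480")   # ~6.75x
print(f"60 fps / {ratio:.2f} ~= {60 / ratio:.1f} fps")   # ~8.9 fps if purely pixel-bound
```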