What's "needed" is arguable and it's subject to individual standards.Great post.
Plus, there are actually 'many' games that are NOT SHOOTERS and don't need that extra responsiveness. For the record.
Yes they were. They were designed with 30fps and console limitations in mind. I dunno how anyone can actually refute this. They weren't designed with high-spec systems in mind and then downgraded to consoles. If these games had been 60fps instead of 30, they would look worse than they did at 30fps, on console or PC, period. They aren't gonna add extra textures, extra polygons, better character models, bigger geometry, smarter AI and whatever other improvements you've got on the PC port. All of the PC improvements of these games come down to: higher resolution, higher fps, and better, sharper versions of the details that already exist in some form on the low end (consoles).
I played Spec Ops: The Line on the highest settings on PC, and it doesn't look next gen; it looks like a PS360 game, same engine, except that it ran at 1440p on my monitor.
What's "needed" is arguable and it's subject to individual standards.
And yet there isn't a single genre, not even turn-based games, where having 60 fps is detrimental to the experience or less appealing visually, so I don't see the value of your argument.
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.
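For what it's worth, the "half" in that claim is just per-frame render budget arithmetic. A quick sketch (plain Python, numbers only, nothing engine-specific):

```python
# At a fixed target framerate, the renderer gets a fixed time slice
# per frame, so doubling the framerate halves the time available to
# draw each frame (all other things being equal).
for fps in (30, 60):
    budget_ms = 1000.0 / fps  # milliseconds available per frame
    print(f"{fps} fps -> {budget_ms:.1f} ms to render each frame")

# 30 fps -> 33.3 ms to render each frame
# 60 fps -> 16.7 ms to render each frame
```

Whether half the render budget actually means "half as visually appealing" is exactly what the rest of this thread argues about.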
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.
Yeah, I definitely call bullshit on this.
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.
Were those games 60fps when they first came out? I enjoyed them on consoles and never missed those extra 30 frames. I've beaten 30fps games multiple times and they offered me a lot of playability; overall in my gaming life I've probably completed 10x as many 30fps games as 60fps ones.
Is this some stealth pc masta raze!!! comment? How can you take the best of everything when sacrifices have to be made between IQ and fps?
The whole chain of thought came from apologists and game websites with a bias for Microsoft, over the Xbone version of Tomb Raider.
Making your game 60 frames does make it less visually appealing. In fact, all things equal, it would look half as visually appealing as it would at 30 frames.
I'm not refuting that at all. You quoted me saying these games *weren't* designed for 60 fps.
They still all play better that way. The trade-offs you make for 60 fps aren't in playability. I'm really not sure where such a suggestion comes from. This generation we're going to start seeing third-person games running at 60 fps on consoles more often, I'd guess, because the visual concessions to hit 60 are no longer what they were, and more developers are going to think it's worth making those concessions for the sake of playability.
I haven't heard The Evil Within's framerate confirmed, but it's more than likely 60 given its engine. Plants vs. Zombies: Garden Warfare was 60. The Last of Us HD will be 60. Tomb Raider wasn't there, but it was close.
We're likely talking about the developers of two of the most cinematic third-person games of the last two generations both choosing 60 fps for a cinematic project in the here and now.
Presuming that it's 60 fps (which obviously it might not be), has The Evil Within done away with RE4's style of enemies and combat? Is The Last of Us going to turn into an inferior product for running at 60 fps?
No and no. Again, presuming Evil Within is 60.
30 fps is a concession made to achieve other goals. I'm not sure that there is really a game developer out there that chooses 30 for the sake of 30.
"Hey guyz, let's just never change anything"-the-post.
Reading comprehension please. Developers choose 30 fps for the sake of other things. No developer is choosing 30 fps because they want 30 fps. They're choosing it because they don't want to concede other things.
On limited hardware, yes, yes there are plenty who deliberately shoot for 30 frames.
The Evil Within is 60 frames because the developers decided that 60 frames best met their goals for what they wanted the game to be. Consoles are fixed hardware, which means you will always have to make compromises and decisions that you feel best suit your game. 60fps brings its own compromises, just like 30 fps brings its own set of compromises. It's all about which compromises you're willing to make and which you're not. The Evil Within being 60 frames doesn't refute that Mikami may want his next game to be 30 frames, if 30 frames allows him to meet his goals for what he wants that game to be.
I believe you're coming at this argument as "all things equal, 60 frames is superior," and yes, I would agree with you, but all things aren't equal. Sacrifices have to be made to achieve 60 frames, and sacrifices also have to be made if you're shooting for 30. It's not a "one size fits all" argument and there is no definitive answer. It depends on what game you're making for the specific hardware. TLOU being 60 frames on PS4 doesn't negate that TLOU was a better game on PS3 hardware when designed around 30 frames.
On limited hardware, yes, yes there are plenty who deliberately shoot for 30 frames.
To be fair to him in one regard, I do understand the argument that framerate CAN be a choice, just as we see people purposefully using shitty video quality in films, or black and white, or what have you... to evoke a sense of time, or to produce the feel of a certain type of entertainment (Rocky Balboa, for example, switches from film to video for its boxing match finale)...
But most everything moves on, because the vast majority of black and white films weren't saying anything by being black and white and just wanted to evoke reality as much as possible.
The same is true of framerate. Right now we're at the point in time where shooting and projecting at high frame rates is expensive, and your choices for going to see it are limited. I know there's a group of people that hate 3D, but it's standard now for certain types of releases, and audiences are holding steady. The expectation is there now that next year's superhero movies will be in 3D, but it took decades of sporadic 3D hits and failures before it reached the point where audiences expect it.
But in time, audiences are going to expect visual fidelity and framerate to get better. And as they get used to it, their minimum standards are going to change.
Just the same way we'd shit on a game made for PS4 that looked and ran like Resident Evil 4 today. And we'd be right to do it. If the latest big budget movie came out and it randomly looked like Technicolor we'd all criticize it.
People use The Hobbit as an example of higher frame rates being a bad thing, NOT because of the higher frame rate itself, but because it makes it easier to notice things that go unnoticed at lower frame rates. You can notice the make-up, the sets and everything, and it breaks the illusion. Again: NOT because higher frame rates are a bad thing. They are not; they give you more information, otherwise you wouldn't be able to notice these things.
What should happen in this case is that studios and directors adapt, or change the way they make these things or shoot their films, so you can't notice them at higher frame rates, to keep the illusion going. Then people would be able to appreciate higher frame rates, because they make scenes and everything going on on the screen so much clearer, especially in fast action scenes or when there is fast camera movement.
Yes, they shoot for 30fps, but it's always for a reason; not because "30fps is more fitting for this game because it is."
At best it's not the only reason. Even with way, way more powerful hardware, I suspect Ready at Dawn or Crytek would still go "OK, so what can we pull off while still being 30 fps, roughly at least?" If you could somehow create hardware that magically had no limits, or more realistically had limits that could not reasonably be reached, they might still force 30 fps, but if they had sense they would make it an option, so that "the feel" could be there for those that want it while those that DON'T can have the game unrestrained.
Funny you mention 3D movies. They are horribly blurry at low framerates, almost unwatchable.
It's time for change, and the fact that we frequently discuss this matter here at GAF makes me want to believe that at least "something" is moving forward. Hopefully we won't see any triple-A games with shitty framerates anymore. Think about GTA V and TLOU, arguably the biggest/most important titles last gen, and both had a super shitty, almost 24fps-like framerate. Seriously, where is that supposed to lead? Where is the "stop"? Every dev says "30FPS" but they are consistently going below that just for better graphics, and people STILL buy it. The worst part: I bought TLOU on day 1 as an ND "fan" and a fan of the setting, and I DID NOT KNOW that the framerate would be THAT shitty even though I READ a ton of reviews!!! This industry is somewhat fucked with its priorities, and if it's losing steam then it's for the right reasons, which is consistently putting VIDEO over GAME. That's not how it works, I believe. Temporarily, for a certain crowd, maybe. But it's not a valuable long-term strategy to weaken your own strong point, which is interaction/being interactive.
It's not universally true of everything in the film either, though. The CG looks more believable at high frame rates to me. During the riddles in the dark sequence, Gollum is far more convincing at 48 fps. Furthermore, everything that's real looks more real. It's not just concessions.
It's a learning experience. I think it was unfair to expect the first big theatrical HFR film to get everything right.
And both of The Hobbit movies were still cinematic when exhibited at HFR. You could argue that they were less cinematic, perhaps, but I've never heard anyone saying they weren't cinematic. That certain things looked more obviously fake? Sure, but uncinematic? Crazy.
I'm really starting to tire of both extreme sides to this well-trodden debate. 30fps looks more filmic, period. If that is something that the developers feel is important to capture, then using 30fps can be a stylistic choice. It also allows them to pump out more detail and effects given the extra rendering headroom.
60fps is objectively better with regard to motion clarity and responsiveness, there's no denying that. But it's not like 30fps is an unplayable mess if it's locked and stable.
Extreme opinions on this tired subject can go fly a kite. Both framerates have their stylistic advantages, and the developers are free to choose whichever they'd like to convey the look they're going for.
Now, sub-30fps drops are another story altogether. That just shouldn't happen.
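To put rough numbers on the responsiveness half of that trade-off: if you assume a simple pipeline where input is sampled one frame, simulated the next, and displayed the frame after (an illustrative three-stage model, not any specific engine), input latency scales directly with frame time:

```python
# Illustrative input-to-display latency at each framerate, assuming
# a hypothetical three-stage pipeline (sample -> simulate -> display),
# each stage costing one full frame.
PIPELINE_STAGES = 3  # assumption for illustration only

for fps in (30, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: roughly {PIPELINE_STAGES * frame_ms:.0f} ms from input to screen")

# 30 fps: roughly 100 ms from input to screen
# 60 fps: roughly 50 ms from input to screen
```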
What strikes me about this argument is that even conceding "24-to-30fps looks more filmic," I'm genuinely not sure why "looking more filmic" is something we should praise and embrace.
Look, if we're going with the notion that 24 fps looks filmic *because that's what films are*, which I personally think is obviously absurd...
I do not like playing 60fps games. 60 FPS simply looks artificial, it breaks immersion, and controlling characters imparts a sense of jerkiness, spazocity, and a general feeling that I'm controlling a tweaker. Why? Not sure. Probably because I'm so used to 24fps movies.
Reading comprehension please. Developers choose 30 fps for the sake of other things. No developer is choosing 30 fps because they want 30 fps. They're choosing it because they don't want to concede other things.
Developers that choose 60, do so because they want 60.
Yes, they shoot for 30fps, but it's always for a reason; not because "30fps is more fitting for this game because it is."
It really does. It makes 24 look really bad in comparison. Even more so than live action stuff. The exaggerations in animation and whatnot are given such a boost.
Absolutely true. Forgot to mention, but yes, during CG it is incredibly better, and even more believable. Animated films should all be released in 48 fps from now on. It makes a lot of difference.
I think you are being sarcastic but: Don't worry, you got a lot of brothers here on GAF.
I do not like playing 60fps games. 60 FPS simply looks artificial, it breaks immersion, and controlling characters imparts a sense of jerkiness, spazocity, and a general feeling that I'm controlling a tweaker. Why? Not sure. Probably because I'm so used to 24fps movies.
Oh no, I'm serious. Feels like I'm controlling a monkey who just broke out of the mountain dew nuka-cola concentrate vault.
Oh no, I'm serious. Feels like I'm controlling a monkey who just broke out of the mountain dew nuka-cola concentrate vault.
You talk like that's a bad thing.
What??? The whole reason it is 30 FPS is that it's sacrificing 60 FPS for higher graphical fidelity.
Some people...
Can we please stop saying the sky is blue because I don't want it to be?
Actually, I'd prefer we stop with the "can we stop with the whole '60fps is not cinematic'" lines. OP's is especially embarrassing. Do you even KNOW what cinematic means? It certainly doesn't mean "more life-like". It means cinema-like, meaning film-like, meaning lower framerate w/motion blur. 60fps is NOT cinematic because its motion looks even far less like cinema than 30fps does. What you're saying is you don't WANT games to look more cinematic; you want them to look more life-like. And that is a whole other matter entirely.
Are you seriously trying to ask this to catch somebody? Why on earth would the developer not take advantage of the extra resources it frees up, regardless of whether they chose 30fps for its more cinematic look?
I'm not sure you understand my question. It basically boiled down to: could The Order still push for its filmic look at 60 fps with the same amount of graphics, or does 60 fps automatically cripple it from looking "cinematic" or "filmic"? Instead of debating through sarcasm, try reading the post and debating what was written.
No, you would only claim motion sickness if the 60fps is a result of doubling frames.
It doesn't. Your brain just isn't used to 60fps video games.
Throwing in my opinion.
30fps is fine. If you have a locked 30 it is a dream to play, as long as the game is fine-tuned for that framerate. The Last of Us was usually around 20fps, which IS a bit jarring. But I absolutely think making it 60fps will take away from its feel, because 60 DOES make things appear faster unless compensated for and slowed down.
Most games benefit from 60fps, but there are exceptions. I'd much rather developers target higher res and a locked 30 than strip a game down to hit 60.
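The "60 makes things appear faster" point is real for games that move things a fixed amount per frame; the usual compensation is scaling movement by elapsed time instead. A minimal sketch (hypothetical loop, not any particular engine):

```python
import time

# Framerate-independent movement. A frame-locked version would instead
# do position += UNITS_PER_FRAME, which literally runs twice as fast
# at 60fps as at 30fps.
SPEED_UNITS_PER_SEC = 5.0  # assumed character speed

position = 0.0
last = time.perf_counter()
while position < 10.0:       # stand-in for the real game loop
    now = time.perf_counter()
    dt = now - last          # seconds elapsed since the previous frame
    last = now
    position += SPEED_UNITS_PER_SEC * dt  # same speed at 30 or 60 fps
```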
I'm not sure you understand my question. It basically boiled down to: could The Order still push for its filmic look at 60 fps with the same amount of graphics, or does 60 fps automatically cripple it from looking "cinematic" or "filmic"? Instead of debating through sarcasm, try reading the post and debating what was written.
I have to imagine that in a scenario where it had the same exact graphical fidelity, it's really just down to personal preference. It may not be what THEY consider "filmic," but those who don't think HFR in The Hobbit or motion interpolation on TVs undermines films feeling like films would probably still feel it looks filmic.
what? :/
Why is this an argument? You can crank all the settings to ultra and still get 60fps when playing on PC; having worse performance in the game does not make it "more cinematic".
Worst post today? I think so.
60fps will always be better than 30fps. Is a fluctuating 60ish better than a locked 30? That's more complicated.
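Part of why it's complicated: judder comes from frame-time variance, not just the average. A toy comparison with made-up frame-time samples:

```python
# Made-up frame times (ms): a locked 30 delivers every frame on a
# perfectly even 33.3 ms cadence; a fluctuating "60ish" averages
# faster but hitches, which is what the eye reads as judder.
locked_30 = [33.3] * 6
fluctuating_60ish = [16.7, 16.7, 33.3, 16.7, 50.0, 16.7]

for name, frames in (("locked 30", locked_30),
                     ("fluctuating 60ish", fluctuating_60ish)):
    avg = sum(frames) / len(frames)
    spread = max(frames) - min(frames)
    print(f"{name}: avg {avg:.1f} ms, frame-to-frame spread {spread:.1f} ms")
```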
I do not like playing 60fps games. 60 FPS simply looks artificial, it breaks immersion, and controlling characters imparts a sense of jerkiness, spazocity, and a general feeling that I'm controlling a tweaker. Why? Not sure. Probably because I'm so used to 24fps movies.