AC: Unity's devs: 60FPS doesn't look real and is less cinematic, 30FPS feels better

Not sure why people use the term "cinematic" when justifying a game that is 30 frames per second.

It's not hard to make a case for a game being 30 frames per second without resorting to words like "cinematic". Just say you went for 30 fps over 60 because you wanted to use the resources to add more features: higher resolution textures, bigger draw distances, larger worlds, more complex A.I./physics/effects, or whatever reducing the frame rate allows you to do.

I would often prefer 60 fps over 30 fps but I can understand developers dropping down to 30 for a lot of reasons (and I generally don't have issues with it being 30) but being cinematic doesn't seem very justifiable to me.
 
Film fps ≠ games fps in various ways (live rendering, player agency), but I think that's already clear. Someone wrote an article on FPS perception and a little of the history behind the motion picture standard over on OCN. Worth a read.
 
 
It isn't hard. The problem is the lower quality graphics they would have then. More or less: twice the framerate, half the graphics quality. It's almost directly proportional.

This is just not true! Reducing shadows from ultra to high, along with a few other settings, will improve the framerate while only having a minor impact on IQ. But like someone else said, you cannot show framerate in pics, and YouTube is only just starting to offer 60 fps video.
 
Honest question:

Isn't it really difficult to achieve a solid, locked 60fps on consoles?

Isn't there a huge balancing act they have to go through where sacrifices need to be made in other areas of the game?

I seriously have no real knowledge of how game developers achieve 60fps. Me, I'm just happy when I don't notice any juddering dips in frame rate, and especially when there's no screen tearing or pop-in.

The problem in going from 30 fps locked to 60 fps locked is that there are always going to be things that take a fixed amount of time per frame. An entirely made up example: if you have 5ms worth of things that are "fixed time", that means at 30 fps (33.3 ms/frame) you have 28.3 ms to do all the other stuff, while at 60 fps (16.7 ms/frame) you only have 11.7 ms. So the stuff that can be made faster actually needs to run more than twice as fast. This is Amdahl's law in effect.
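The arithmetic above can be sketched quickly (the 5 ms fixed cost is the same made-up figure as in the post, not a real profile):

```python
# Frame-budget arithmetic for the made-up 5 ms "fixed time" example above.
def variable_budget_ms(fps: float, fixed_ms: float) -> float:
    """Milliseconds left per frame after the fixed-cost work is done."""
    frame_ms = 1000.0 / fps
    return frame_ms - fixed_ms

budget_30 = variable_budget_ms(30, 5.0)  # ~28.3 ms of variable work
budget_60 = variable_budget_ms(60, 5.0)  # ~11.7 ms of variable work

# The variable work must run ~2.4x faster, not just 2x: Amdahl's law.
speedup_needed = budget_30 / budget_60
```

So the bigger the fixed-cost slice, the worse the required speedup gets.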

Yes, it's difficult, and you'll need to make sacrifices to achieve that. It's up to the developers to decide if those sacrifices are worth it (as a PC gamer I will always drop graphics settings to achieve 60 fps).

I really like the Hobbit reference. They know people, even the ones who haven't watched it, have heard about the controversy over the movie's high frame rate and how it can feel fake. That serves to plant doubt in their minds, so they think "maybe they are right, I also heard about that."
The best thing about that is that the reason people don't like the way Hobbit looks is because it looks too real. It looks weird because it's closer to watching a theater play. Somehow Ubisoft twisted this into "30 fps looks more real".
 
Hmmmm, I am not sure about that. I heard it was a technological limitation that allowed them to display at 24, and that is why it became the standard.
You're wrong about stop motion. It actually only has to be above 9 fps to fully qualify as a film, though this is obviously at its most rudimentary point.

It may qualify as a film due to definitions or whatever reason.
http://www.youtube.com/watch?v=7EluIdiuUuA
This however is not fluid in motion. Of course it's not real "stop motion", which is a method for animating lifeless objects. I did a stop motion movie once. Fun, but veeeeeeeery tiring. Took forever to do a 15 sec animation.

From Wikipedia:
"Thomas Edison said that 46 frames per second was the minimum needed by the visual cortex: "Anything less will strain the eye." In the mid to late 1920s, the frame rate for silent films increased to between 20 and 26 FPS."

"When sound film was introduced in 1926, variations in film speed were no longer tolerated as the human ear is more sensitive to changes in audio frequency. Many theaters had shown silent films at 22 to 26 FPS, which is why 24 FPS was chosen for sound. From 1927 to 1930, as various studios updated equipment, the rate of 24 FPS became standard for 35 mm sound film. At 24 FPS the film travels through the projector at a rate of 456 millimetres (18.0 in) per second. This allowed for simple two-blade shutters to give a projected series of images at 48 per second, satisfying Edison's recommendation."
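The quoted numbers check out: standard 35 mm film advances 4 perforations, i.e. 19 mm, per frame, and a two-blade shutter flashes each frame twice:

```python
# Sanity check of the quoted Wikipedia figures for 24 fps 35 mm sound film.
FPS = 24
MM_PER_FRAME = 19                       # 4-perf pulldown on 35 mm film

film_speed_mm_s = FPS * MM_PER_FRAME    # 456 mm/s, as the quote says
flashes_per_second = FPS * 2            # two-blade shutter: 48 flashes/s

# 48 flashes/s clears Edison's recommended 46 per second for the eye.
```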

So okay. It was a mixture of taking an Edison recommendation and working around it within the technical limitations of the time. Other sources also talk of the trade-off between quality and film cost.

There have been lots of cases during the history of cinema where people rejected new technology. A lot of people were heavily opposed to film when it included sound, or "talkies" as they were called. People thought that the cinematic experience would be ruined if they had to deal with this extra sense.
A lot of great actors and movie makers didn't manage to make the transition from silent to talkies either. Today, most people only know of Buster Keaton and Charlie Chaplin. Most of the others faded with silent film, as it was almost a different category and way of telling stories by itself.
To a lesser degree the same held true with color film.

But what sound and color added to film was new tools for storytellers to tell stories in different ways, and more effectively. That made them an easy sell, and once they were in, there was really no going back to the old way. They proved too effective, too useful. Not using them would almost be limiting.

Maybe because not all actors are memorable? Just like you don't remember every actor from the 70s, or every NFL player in history? But the ones you mentioned are still remembered and beloved. So maybe it is because they were good, and not because of the new technology? Good actors can adapt.

60 FPS is a different thing because you're talking about habits. There is not much to gain in new storytelling techniques from pushing 60 fps in movies over 24. That's the key difference here. It has remained the standard because it's what people are used to.

There are not many movies made in 48/60fps. But I know one thing: watching The Hobbit was a great experience because there was no blur during camera panning, so you could admire the beautiful set pieces even more. Also, action sequences filmed with a dolly cam were much easier to follow and not just a hectic, blurry mess. I am talking about the barrel scene in the second movie here. I think something like that is a very positive effect for people watching the movie. It was very hard to follow the blurry, fast-cut battles of LOTR without getting dizzy.

Finally, you are completely wrong about cinema having nothing to do with video games. Almost every good non-interactive video game story is completely modeled after the way they do things in film: cinematography, use of voice, directing, sound, lighting, blocking, and so on and so forth. For the longest time it has been what a lot of developers have strived for: to make video game stories with production values at the level of film.

It seems you are talking about cutscenes ... but I don't play cutscenes. I have to control the gameplay. I have no problem with cutscenes being still pictures, 24 fps with black bars, or just plain text (Max Payne 1 being a glorious example of great storytelling with still frames).

But in the end I have 10+ hours of gameplay. And this is what matters. Here I have to control the character from a fixed angle (behind the shoulder or even first person). I control the camera. I have to make split second decisions and coordinate eye and hand. Playing a game at 30fps with a mouse is terrible. This is not a movie where you lean back and watch it.

And one last thing regarding the cutscenes: even if games have the production values of film, the stories of these games often resemble mindless blockbusters like Transformers. I don't see many games with high production values and great stories, memorable characters and such.

Well this was a long post^^ But just wanted to say how I see it.

If there are mistakes in grammar or spelling, I'm sorry, but with English being my third language and school long forgotten ... you get the drill
 
Untrue. 100% untrue. This is some high-level ridiculousness. These people are horrible at spin; this is beyond-belief wacky.
 
Not sure why people use the term "cinematic" when justifying a game that is 30 frames per second.

It's not hard to make a case for a game being 30 frames per second without resorting to words like "cinematic". Just say you went for 30 fps over 60 because you wanted to use the resources to add more features: higher resolution textures, bigger draw distances, larger worlds, more complex A.I./physics/effects, or whatever reducing the frame rate allows you to do.

I would often prefer 60 fps over 30 fps but I can understand developers dropping down to 30 for a lot of reasons (and I generally don't have issues with it being 30) but being cinematic doesn't seem very justifiable to me.
Exactly. Making the case for 30 fps isn't hard. We understand why developers do it, especially for a game like this.
But don't pretend you are giving us the best of both worlds at 30 fps.
 
Since 30 fps is better than 60 fps, I will not understand it if the PC version is not capped at 30 by default.

edit :
So not only are you artificially limiting the resolution of the PS4 version down to 900p to "avoid debates", now you're locking the PC version to 30fps because it "feels better" and is "more cinematic"? Fuck, that's some EA level shit.

wait, is it actually happening ?
 
Why even say that? Just put out the game and say something like

"We wanted to make it as beautiful and have as many people on screen at one, so 30fps"

Casuals get nothing out of this information, and forums like GAF explode with each statement. Seriously, why do they even say these things :O Just staying quiet is much better.
 
I could cry; once again, more ammunition for the "30 fps is fine" people. It is like they want to spread misinformation and lies to ensure conflict and fanboyism. It is really sad.

There is a large difference between "30fps is fine for many types of games, and an acceptable standard depending on the scope of each specific game when tradeoffs between scale and performance are necessary" and "30fps feels better, because it's more cinematic".

Any PC gamer will say that the second point is a damn lie. Hell, any console gamer would tell you that. Try and play a fighting game like SF or Tekken at 30fps. Try and play a twitch shooter like COD at 30fps. Try and play Bayonetta on PS3, yeah no.
 
If you play a game at 30fps for a few hours and then crank it up to 60 fps, it kinda looks weird and sped up right away, but you get used to it pretty fast.
If you do it the other way around and play at 60 fps for a few hours, then turn it down to 30 fps, it straight out looks effing broken.
30 fps does NOT look more real ubisoft!
Sure I can play a game at 30 fps, but god damn this pr spin bullshit needs to stop, 60 fps will always be better!

But hey, you can always just buy it on PC and play it at 60 fps, right? If Black Flag is anything to go by, you're gonna need two GTX 980s in SLI, and it will probably stutter like hell even then.
The whole watchdogs thing and now this, I think I am already done with ubisoft this generation...
 
It also lets us push the limits of everything to the maximum.

To me that seems to be the main point, sacrificing frame rate for better physics or effects is fine (as long as it's a steady 30fps).

But skirting around it with this cinematic nonsense is asking for trouble.
 
I don't care much for the 30 fps vs. 60 fps debate as I'll take either or and be happy with it.

However, I do raise an eyebrow when I hear bullshit reasons for not having 60 fps. Just say you didn't want to do it like that. Don't give some bullshit reasoning that nobody will buy.
 
You know... I don't particularly MIND 30 fps, I can deal with it, and it may even be possible to convince me that the trade off in a particular game could be worth it...

but don't tell me 30 fps on its own is better than 60 fps. that's such a stupid thing to say.

Ubisoft batting 100 this week.
 
They are really firing on all cylinders with this game aren't they. These past few days have provided more entertainment than the game probably will.
 
How on earth can you say such a thing? And keep Ratchet and Clank out of it. Devs stopped going 60fps because they didn't receive any recognition for going 60, not because of the "it's not cinematic" nonsense. Ridiculous.
 
It actually feels better for people when it's at that 30fps.

Thank you for speaking on behalf of everyone.

It's like when people start asking about resolution. Is it the number or the quality of the pixels that you want? If the game looks gorgeous, who cares about the number?

ACVI - 240p edition: It looks gorgeous
 
Since 30 fps is better than 60 fps, I will not understand it if the PC version is not capped at 30 by default.

edit :


wait, is it actually happening ?

I don't know if it's happening or not, but their dumbass defenses make it seem like it would not be surprising if they did. After all, they already artificially capped one SKU with a terrible explanation.
 
There is a large difference between "30fps is fine for many types of games, and an acceptable standard depending on the scope of each specific game when tradeoffs between scale and performance are necessary" and "30fps feels better, because it's more cinematic".

Any PC gamer will say that the second point is a damn lie. Hell, any console gamer would tell you that. Try and play a fighting game like SF or Tekken at 30fps. Try and play a twitch shooter like COD at 30fps. Try and play Bayonetta on PS3, yeah no.

So which of these quotes is responsible for the Dead Rising 3 PC 30fps lock, or The Evil Within's 30fps lock?

Edit: Do not forget The Crew 30 fps lock on PC ;)
 
Mon dieu...why won't you just keep your trap shut, Ubisoft?

If they keep quiet, you don't find out about their incompetence until AFTER you have given them money.

At least this way you have no excuse, and if you still pay up it's your own fault for ignoring all the warning signs.
 
ubisoft please.
30fps is easy to explain, and most people understand. Please don't even start throwing the word "cinematic" everywhere, it just backfires.
 
A ridiculous argument. The film companies agreed on 24 fps eighty years ago for cost reasons, not because of the aesthetics. It was just enough to look fluid.

People are used to this standard, that's why The Hobbit looks strange. If you have always eaten yoghurt with strawberry taste, then it would feel strange to eat a yoghurt with real strawberries.

If you let your child play only 60 fps games and watch only 60 fps television, it would certainly not appreciate the cinematic look when you go to the cinema.

tldr: People are used to bad things, but that doesn't make them better.
 
60FPS doesn't look real and less cinematic?

It's a game, and games need fast reactions (especially an action game like AC: Unity), so why does a player-controlled action game need 30FPS output as feedback?

The devs can't make the game run at 60FPS, so they try to justify it? If the game doesn't run at 60, just say it doesn't, with real reasons. Trying to find a loophole by referencing something else isn't the appropriate way; it loses respect and what not.
 
To me that seems to be the main point, sacrificing frame rate for better physics or effects is fine (as long as it's a steady 30fps).

But skirting around it with this cinematic nonsense is asking for trouble.

Yep, shame they have both consoles at 900p so there is "no debate" and stuff. So they aren't really pushing to make things better.
 
After The Last of Us Remastered, I no longer believe 60fps feels 'less cinematic'. A well-directed game feels cinematic, something AC hasn't achieved outside of a handful of cutscenes.
 
I long for the day when the word "cinematic" is no longer associated with gaming because we have developed a well-defined lexicon of our own without borrowing from other mediums.
 
All of these complaints, and it will still be one of Ubisoft's top selling titles, and thus they will simply rinse and repeat.
 
To me that seems to be the main point, sacrificing frame rate for better physics or effects is fine (as long as it's a steady 30fps).

But skirting around it with this cinematic nonsense is asking for trouble.

Yep, shame they have both consoles at 900p so there is "no debate" and stuff. So they aren't really pushing to make things better.

Actually, if a game has better physics and particle effects onscreen, it's the best case scenario to have higher frame rates.

Example: AC: Unity has some amount of cloth physics. Having the cloth physics react at 60 or more updates per second, displayed at 60FPS, will give a better visual experience than half the updates and half the FPS.

It's always a good thing to update anything dynamic (like physics or particle effects) at a faster rate and then display it quickly onscreen.
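Decoupling simulation rate from display rate like this is usually done with a fixed-timestep loop. A minimal sketch (the `advance`/`sim_step` names and the 60 Hz rate are illustrative, not from any particular engine):

```python
# Fixed-timestep simulation loop: physics advances in fixed 60 Hz steps,
# independent of how fast (or slow) frames are actually rendered.
SIM_DT = 1.0 / 60.0  # seconds per physics update

def advance(accumulator, state, frame_time, sim_step):
    """Consume the elapsed frame_time in fixed SIM_DT physics steps."""
    accumulator += frame_time
    while accumulator >= SIM_DT:
        state = sim_step(state, SIM_DT)
        accumulator -= SIM_DT
    return accumulator, state

# A 30 fps frame (33.3 ms) yields two physics steps; a 60 fps frame, one.
acc, sim_time = advance(0.0, 0.0, 1.0 / 30.0, lambda s, dt: s + dt)
```

Leftover time stays in the accumulator, so the simulation never drifts no matter how uneven the render frames are.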
 
lol, bullshit like this is why I can't stand this company.

Tip to Ubisoft - stop treating your audience like they are "the other". Stop condescendingly talking down to the people that make your job possible. We're people, not mentally deficient children.
 
Actually, if a game has better physics and particle effects onscreen, it's the best case scenario to have higher frame rates.

Example: AC: Unity has some amount of cloth physics. Having the cloth physics react at 60 or more updates per second, displayed at 60FPS, will give a better visual experience than half the updates and half the FPS.

It's always a good thing to update anything dynamic (like physics or particle effects) at a faster rate and then display it quickly onscreen.

What I was getting at is you could understand 30fps if they were pushing both consoles to the max. Obviously higher res and FPS will always look better.
 