The Order: 1886 is 30fps because 24fps doesn't "feel good", 60fps "changes aesthetic"

In this instance, yes it does. Because it allows them to push the visuals and effects to provide a more realistic and filmic look, beyond what they'd be able to do at 60fps.

Filmic yes, realistic no. Light moves well beyond double or single digits; it can't be realistic if you're limiting the very speed at which we see motion or light.

All that dropping to 30fps does is what you mentioned: it ups the visuals, because they can't do it at 60fps with the same assets and effects. Even then, from the footage we have seen, it's unstable, and their visuals are the sole reason why, considering what they have mentioned and what we know about what FOV and motion blur can cost computationally.
 
Filmic yes, realistic no. Light moves well beyond double or single digits; it can't be realistic if you're limiting the very speed at which we see motion or light.

All that dropping to 30fps does is what you mentioned: it ups the visuals, because they can't do it at 60fps with the same assets and effects. Even then, from the footage we have seen, it's unstable, and their visuals are the sole reason why, considering what they have mentioned and what we know about what FOV and motion blur can cost computationally.

No, it allows them to have graphics that are more filmic AND realistic. This shit is borderline CGI quality. You simply wouldn't get this level of visual presentation at 60fps, not on this hardware, or at this stage of the console development cycle.

[Screenshots of The Order: 1886]
 
I don't think the Shadow of the Colossus example is really a good one here. That game was doing crazy stuff from both a gameplay and a graphics standpoint, whereas The Order, from what I have seen (which isn't much to be fair, but I've checked out a few gameplay videos), looks like a mostly standard, albeit gorgeous, third-person shooter. I haven't seen anything that blew me away in terms of scale like SotC did at the time.
Yeah, this is more what you could possibly throw at The Witcher 3, not The Order.

And I think for a lot of people it's more about HOW this is being pushed anyway. If it was just "we can fit more visual effects in at 30" it'd be the same old story, but this time we have them saying 60fps doesn't create the FEEL they want, and they even seriously toyed with the idea of going at 24fps. It's making a lot of us feel they've got somewhat screwy priorities, even those of us who are OK with a game being at 30fps.
 
I don't disagree. They've gone hard out to give the game a certain aesthetic, 60 FPS wouldn't really match the rest of the picture they have built. That said, I don't think anybody was expecting 60 FPS with the visuals they are pumping out.

In films I'm reminded of The Hobbit in 48 FPS; the whole movie moves like a Benny Hill show.
 
No, it allows them to have graphics that are more filmic AND realistic. This shit is borderline CGI quality. You simply wouldn't get this level of visual presentation at 60fps, not on this hardware, or at this stage of the console development cycle.

Filmic yes, realistic no, unless you're saying it to some degree replicates how cameras work in a realistic manner. In regard to our eyes, or how the eye works? No, not in the least. Two different points are being made, so pick and choose what you really mean. I'm not arguing for the right to state my opinion, or others', just because you don't like what's being said; these are objective facts anyone can measure.
 
In this instance, yes it does. Because it allows them to push the visuals and effects to provide a more realistic and filmic look, beyond what they'd be able to do at 60fps.

Pushed visuals don't equal a cinematic experience. It means better visuals, nothing more or less.

What you classify as 'better games' isn't necessarily the same as what everyone else does. If some of you '60fps aficionados' weren't so narcissistic, it'd probably be easier to have a proper discussion, rather than the reductive bullshit you're spouting here.

I have never seen a compelling argument that games should be played in 30 fps.
 
Filmic yes, realistic no, unless you're saying it to some degree replicates how cameras work in a realistic manner. In regard to our eyes, or how the eye works? No, not in the least. Two different points are being made, so pick and choose what you really mean. I'm not arguing for the right to state my opinion, or others', just because you don't like what's being said; these are objective facts anyone can measure.

Filmic and realistic are not opposing terms. It can be filmic and realistic. The presentation and camera work, as you said, are filmic. The character models, lighting, and texture work are realistic.
 
Be honest and say you're pushing the game visually as far as the system can go.

All devs who make AAA-budget games push the system as far as they can at the time. Some find more efficient methods than others, and some have better artists than others, but their job is to make the best compromises they can, because internally each department and employee wants the best quality they personally can deliver; that work is going to go on their portfolio reel for their next job.

Unfortunately, modern game hardware can't handle the full brunt of every game asset or feature. The directors get to see all the content and features, try them out, and decide if the product is better with one over the other. They have to decide early on if all this cool new technology their staff has worked hard on is worth using, or if a feature like 60fps ends up being more important and making the game more unique.

They have to ship the game at some point, and it's easier to make large sweeping changes on their next title than to redo millions of dollars' worth of work and delay the game just to eke out another 10% in visuals and/or performance headroom.


Anyway, no AAA developer will leave a huge percentage of system resources on the table unless they're in a rush to ship the game, or in a case where nothing else really makes the game significantly better.
 
Filmic yes, realistic no, unless you're saying it to some degree replicates how cameras work in a realistic manner. In regard to our eyes, or how the eye works? No, not in the least. Two different points are being made, so pick and choose what you really mean. I'm not arguing for the right to state my opinion, or others', just because you don't like what's being said; these are objective facts anyone can measure.

No, realistic as in graphical features that make games look more real: e.g. better subsurface scattering on skin, better lighting, shadows, post-processing, volumetrics, colour grading, textures, depth of field and other focus effects, per-object motion blur, light flare, better material properties and reflections, higher-poly characters and surrounding geometry, better image quality, anti-aliasing, etc.

These things are computationally expensive, but are things that make graphics in games look more realistic. That's the main point of going 30fps over 60fps, to allow for more realistic, or "filmic" visuals.
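The trade-off these posts keep circling is, at bottom, a time-budget question: dropping the target from 60fps to 30fps doubles the milliseconds available to render each frame, and that extra time is what pays for the expensive effects. A quick sketch of the arithmetic (illustrative only; not numbers from the game):

```python
# Per-frame time budget at common target frame rates.
# Illustrative only: real engines split this budget across CPU and GPU work.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")

# Dropping from 60fps to 30fps roughly doubles the time available each
# frame for lighting, post-processing, and other effects.
extra = frame_budget_ms(30) - frame_budget_ms(60)
print(f"Extra budget at 30fps vs 60fps: {extra:.1f} ms")
```

That ~16.7ms of extra headroom per frame is the whole bargain: it goes to the effects list above, at the cost of motion smoothness.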
 
No, realistic as in graphical features that make games look more real: e.g. better subsurface scattering on skin, better lighting, shadows, post-processing, volumetrics, colour grading, textures, depth of field and other focus effects, per-object motion blur, light flare, better material properties and reflections, higher-poly characters and surrounding geometry, better image quality, anti-aliasing, etc.

These things are computationally expensive, but are things that make graphics in games look more realistic. That's the main point of going 30fps over 60fps, to allow for more realistic or filmic visuals.

No one is arguing against the fact that they chose 30fps because of hardware limitations. That's kind of the point of the criticism.
 
No, realistic as in graphical features that make games look more real: e.g. better subsurface scattering on skin, better lighting, shadows, post-processing, volumetrics, colour grading, textures, depth of field and other focus effects, per-object motion blur, light flare, better material properties and reflections, higher-poly characters and surrounding geometry, better image quality, anti-aliasing, etc.

These things are computationally expensive, but are things that make graphics in games look more realistic. That's the main point of going 30fps over 60fps, to allow for more realistic, or "filmic" visuals.

Thank you for clarifying. I already know these things are computationally expensive and have mentioned that. I also mentioned why it's stupid: your display will gimp all those things you did to make it prettier. It's like buying an HDTV and then smearing Vaseline over it.

Filmic and realistic are not opposing terms. It can be filmic and realistic. The presentation, camera work as you said is filmic. The character models, lighting, texture work is realistic

The way he was using it, they can be, especially if you don't know what he means by it. My complaint, as I always say, is: yes, the assets are improved, but the displays we use gimp our overall ability to see the much-improved work.

such bullcrap. i don't even know where to begin.

Try considering that you're the one claiming it's bullshit. Whoops, I forgot you want to make a point with no real substance.
 
I don't disagree. They've gone hard out to give the game a certain aesthetic, 60 FPS wouldn't really match the rest of the picture they have built. That said, I don't think anybody was expecting 60 FPS with the visuals they are pumping out.

In films I'm reminded of The Hobbit in 48 FPS; the whole movie moves like a Benny Hill show.
But The Order doesn't look real. What you're saying is so silly, because games look like "games" and should be run at the highest frame rate possible. The Order can't run any higher than 30fps, and the developers need to just call it like it is.
 
I didn't interpret the quote in the OP to mean "We tried it at 24 FPS first before deciding on 30."

It's common knowledge that 24 FPS doesn't feel good to play. And honestly with the graphics looking the way they are, a locked 30 is completely fine for a single player only game.
 
Thank you for clarifying. I already know these things are computationally expensive and have mentioned that. I also mentioned why it's stupid: your display will gimp all those things you did to make it prettier. It's like buying an HDTV and then smearing Vaseline over it.



The way he was using it, they can be, especially if you don't know what he means by it. My complaint, as I always say, is: yes, the assets are improved, but the displays we use gimp our overall ability to see the much-improved work.



Try considering that you're the one claiming it's bullshit. Whoops, I forgot you want to make a point with no real substance.


and what did you think he meant by "realistic"?

"oh it isn't realistic because light travels so fast so 30 fps doesn't mimic the real world. oh you mean the way the graphics look? mkay then."

you're trying to be a wiseass about it. or are you going to say cod is more realistic because it refreshes images closer to how fast the eye receives photons? please.
 
Thank you for clarifying. I already know these things are computationally expensive and have mentioned that. I also mentioned why it's stupid: your display will gimp all those things you did to make it prettier. It's like buying an HDTV and then smearing Vaseline over it.

Despite your gross exaggeration, even that, in a way, is more realistic. Movies aren't usually completely pixel-perfect and tack-sharp; just look at captures from 1080p Blu-ray transfers. Ultimately, all that matters is how realistic and lifelike it looks, and right now The Order: 1886 is graphically the most photorealistic or CGI-esque game out there. So whatever they're doing, or have done, works.
 
I don't see why this is so complicated for people. Films (and filmic productions like modern TV drama) are typically associated with lower frame rates, i.e. 24, 25 or 30fps. Television, specifically cheaper content (sitcoms, reality programs, soaps, live sports, etc.), is typically associated with higher frame rates, i.e. 50 or 60fps. As a result of these standards, we typically interpret lower frame rates as looking more "filmic".

This is why a bunch of people walked out of HFR screenings of The Hobbit with the distinct impression that it looked "cheap", or like behind-the-scenes footage.

I think RAD know exactly what they're doing.

I swear if anyone tries to pull me up on the progressive vs interlaced thing, I'll snap. I'm simplifying for the purposes of an easier-to-understand explanation.
 
and what did you think he meant by "realistic"?

"oh it isn't realistic because light travels so fast so 30 fps doesn't mimic the real world. oh you mean the way the graphics look? mkay then."

you're trying to be a wiseass about it. or are you going to say cod is more realistic because it refreshes images closer to how fast the eye receives photons? please.

Plenty of things, including what he listed, hence why I asked.

I've given him plenty of other reasons, which have been listed various times, as to why it's not realistic, or, for the sake of someone like yourself, as real as it could be. So light wasn't the only thing I was referring to regarding realism. If you want that list, just PM me instead of indicting my opinion and then being snarky. Nothing is or was directed at you, so this really has nothing to do with you, yet I'm a wiseass while you butt into a situation that wasn't yours and call me full of shit.

Despite your gross exaggeration, even that, in a way, is more realistic. Movies aren't usually completely pixel-perfect and tack-sharp; just look at captures from 1080p Blu-ray transfers. Ultimately, all that matters is how realistic and lifelike it looks, and right now The Order: 1886 is graphically the most photorealistic or CGI-esque game out there. So whatever they're doing, or have done, works.

Duh, and I'm not saying they aren't. Slow down a minute and actually read what is being typed.

Let's ignore peak fps in this debate and focus on the filmic look and what it will do to the stability of the fps, be it 60fps or 30fps. That premise is totally valid considering the footage we have seen is unstable; how much or how often is not the point, that it happens at all is. While they are making decent strides in recreating aspects of the filmic look, some of them are redundant. FOV eats into fps; if it didn't, more console games last gen, especially shooters, would've enjoyed a bigger one. FOV is not a movie-only thing; it's more about the camera than anything else. These devs have admitted they can't achieve 60fps, yet one of the effects they are doing eats at this, so any way you slice it, they are doing something in a shooter that will lead to fps drops. The other big filmic point I can and will make is the blur. If their blur was out of this world and done in a way nothing else out there is doing, I would shut my mouth, but it's not that special, and considering the cost, and what is already in games, I feel it's not worth the big hoopla going around.

As for your last bit: not by a mile. Games still look like games to me, and arguing over which one is more realistic without having all the info devs have is something those with an axe to grind like to do. Nor is it a game that is out, so basically you're championing a game that is still in development, and one that by the end of the generation won't keep such an achievement.
 
Yea right, that's a BS excuse and you know it. They can't do 60fps. How about offering an option for 60fps if it's possible?
Who said it is possible? They would have made a decision at the start of development to make a 30FPS or 60 FPS game, and they went with 30 FPS which allowed them to achieve their "filmic" desires.
 
I guess my big problem with the cinematic and filmic approach is that I don't see the world through a camera lens, so a lot of the post-processing additions seem to just interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their rigor to their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to any degree we see in games); it's sad to see great art and texture work that's completely hidden under a wall of muggy effects.

I think my favorite example is this.

[Image: HDR in photography vs HDR in video games]
 
If they genuinely want a "filmic look" then the gameplay has to match and the game has to be 24fps, unless of course they want the HFR look and feel (which apparently they haven't given consideration to). As usual with such a statement, you can guarantee that the gameplay will be contradictory to a filmic look... because it's a fucking video game

I guess my big problem with the cinematic and filmic approach is that I don't see the world through a camera lens, so a lot of the post-processing additions seem to just interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their rigor to their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to any degree we see in games); it's sad to see great art and texture work that's completely hidden under a wall of muggy effects.

I think my favorite example is this.

[Image: HDR in photography vs HDR in video games]

Indeed, this is one of the many examples where even a claim of going for a "filmic look" will be contradicted by an incessant need for such over-the-top effects.
 
I guess my big problem with the cinematic and filmic approach is that I don't see the world through a camera lens, so a lot of the post-processing additions seem to just interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their rigor to their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to any degree we see in games); it's sad to see great art and texture work that's completely hidden under a wall of muggy effects.

I think my favorite example is this.

[Image: HDR in photography vs HDR in video games]



Kinda hard to get artists to, well... not be artists.

The effects will get more refined and better with time, but it's hard to mimic reality when you're making artistic choices to tone-map the lighting in a way human eyes don't see light. Our pupils dilate for a reason ;)

I think the Dark Souls people on NeoGAF just had a huge freaking thread about how they can't stand the always-on ambient lighting added to Dark Souls 2 to make everything a little more clear to see. They wanted the dark areas to mimic the Tomb of the Giants in DS1, where you couldn't see past two feet in front of you.
 
Kinda hard to get artists to, well... not be artists.

The effects will get more refined and better with time.

I think the Dark Souls people on NeoGAF just had a huge freaking thread about how they can't stand the always-on ambient lighting added to Dark Souls 2 to make everything a little more clear to see. They wanted the dark areas to mimic the Tomb of the Giants in DS1, where you couldn't see past two feet in front of you.

The problem is that you're not supporting your artists. Your artists spent time building the world, putting in those textures, filling it with awesome content and making a place beautiful to be in, and you're deciding to cover all of that with overblown effects: DoF, motion blur, lens flares, bloom lighting, HDR and everything else that just takes away from what your amazing artists worked on.

Being subtle enhances all that art and work put into your game, but few developers are subtle with such effects. When you see rain or snow, it's barely ever a light drizzle; it's almost always a blizzard or storm.

At times liberty is taken; Mass Effect was purposefully going for the '80s/'90s science-fiction effect, and so they included things like film grain and lens flares, but that's intentionally fitted within all the concept design and art of the game, to the point that without them it wouldn't speak to that space drama. But in many games (not saying The Order is doing it here), it's there because a programmer made it or it's the new popular thing to do.
 
If they genuinely want a "filmic look" then the gameplay has to match and the game has to be 24fps, unless of course they want the HFR look and feel (which apparently they haven't given consideration to). As usual with such a statement, you can guarantee that the gameplay will be contradictory to a filmic look... because it's a fucking video game



Indeed, this is one of the many examples where even a claim of going for a "filmic look" will be contradicted by an incessant need for such over-the-top effects.

The game approach works better in that realm, though. It's all artistic decisions in game making, but I hardly lose sleep over the real-life vs game comparison, since games are tailor-made to evoke what the dev wants them to say.
 
Seems framerates are all the rage these days, I thought some people would be interested. More at the link.


Source: http://kotaku.com/a-developers-defense-of-30-frames-per-second-1580194683

Would have been great if you also put this in:

"Then, on top of it, I don't know of any other games that are gonna look like our game in real-time with no pre-rendered movies, with all the stuff that's going on lighting-wise, and run at 60. I think that's probably the thing that most people underestimate is [that] to make a game look like this—the way that they're lit, the number of directional lights that we have… We don't have a game where you're just outside in sunlight, so there's one light. We have candles flickering, fires, then characters have lights on them. So [to make] all those lights [work] with this fidelity means, I think, until the end of this system most people won't have any clue how to make that run 60 and look like this.

Bolded the whole statement because all of it is important for accurate comprehension.

If it looked less real, VR wouldn't be needing 60fps at the very minimum. They just don't want to say "Yeah, we can't do the graphics we have right now at 60fps on the PS4." It sounds better to say it's an artistic choice.

VR is different from viewing through an HDTV. A higher frame rate is necessary to prevent sickness, whereas that is not always so with an HDTV.

All in all, it sounds like most, if not all, of you didn't bother to read the entire article. I never took any of this stuff as RAD denying that the 30fps target was also due to limitations set by the PS4 hardware.

We also cannot deny that they have chosen 30fps and 800p for their artistic vision of the game, so I wish people would just accept that and move on. While it would be unprofessional for them to claim that the filmic feel is the only reason for choosing these specifications, it is silly for us to try saying that it is not a main reason also.

If 60 was their goal they could have obtained it with concessions.
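The developer quote above singles out light count as the big cost. In a naive forward renderer, shading work grows roughly with shaded pixels times dynamic lights, which is why a single-sun outdoor scene is far cheaper than a candle-lit interior. A toy sketch of that scaling; every constant here is an invented assumption (real engines use light culling, deferred shading, and other tricks, and none of these figures come from RAD):

```python
# Toy model: in naive forward shading, per-frame lighting work grows with
# (shaded pixels x dynamic lights). The per-light cost below is made up
# purely for illustration.

def shading_cost_ms(pixels: int, lights: int,
                    ns_per_light_per_pixel: float = 2.0) -> float:
    """Rough shading time per frame, in milliseconds."""
    return pixels * lights * ns_per_light_per_pixel / 1_000_000

pixels = 1920 * 800  # assumed framebuffer for the game's "800p"

# One directional "sun" vs a candle-lit interior with many small lights:
print(f"1 light:  {shading_cost_ms(pixels, 1):.1f} ms")
print(f"8 lights: {shading_cost_ms(pixels, 8):.1f} ms")

# In this toy model, the 8-light scene fits a 33.3 ms (30fps) frame
# budget but blows through a 16.7 ms (60fps) one.
```

The point is only the shape of the curve: cost multiplies with light count, so "candles flickering, fires, then characters have lights on them" is exactly the scenario that eats a 60fps budget first.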
 
The problem is that you're not supporting your artists. Your artists spent time building the world, putting in those textures, filling it with awesome content and making a place beautiful to be in, and you're deciding to cover all of that with overblown effects: DoF, motion blur, lens flares, bloom lighting, HDR and everything else that just takes away from what your amazing artists worked on.

Being subtle enhances all that art and work put into your game, but few developers are subtle with such effects. When you see rain or snow, it's barely ever a light drizzle; it's almost always a blizzard or storm.

At times liberty is taken; Mass Effect was purposefully going for the '80s/'90s science-fiction effect, and so they included things like film grain and lens flares, but that's intentionally fitted within all the concept design and art of the game, to the point that without them it wouldn't speak to that space drama. But in many games (not saying The Order is doing it here), it's there because a programmer made it or it's the new popular thing to do.


I don't know, man; film grain and lens flares are not a product of the '80s and '90s, they're a product of analog film cameras and camera lenses. It's fine if you don't like them though, but they are a crucial element in making CG work not look like CG.

Most CG artists actually do use post effects in their work to some degree - myself included. I agree that certain ones are overdone. They haven't yet learned restraint. It'll happen though.


From a business side though, if a company just sank $4 million into developing a weather-effects system, they damn sure are going to make sure it's seen. Otherwise, if it's just a tiny drizzle, people are liable to miss it, and what was the point then, when that money could have been used to add another area or two?

The tech builds upon itself and the artists are always trying to out-do what they did before. If they want to play things safe or ship a game quick, they can just keep an existing workflow and target 60fps. Later though, they'll want to show off how much better they've become and drop the frame-rate to 30.
 
I don't know, man; film grain and lens flares are not a product of the '80s and '90s, they're a product of analog film cameras and camera lenses. It's fine if you don't like them though, but they are a crucial element in making CG work not look like CG.

Most non-gaming CG artists actually do use post effects in their work to some degree - myself included. I agree though that certain ones are completely overdone - that always happens when artists get a new tool in their toolbox. They haven't yet learned restraint. It'll happen though.


From a business side though, if a company just sank $4 million into developing a weather-effects system, they damn sure are going to make sure it's seen. Otherwise, if it's just a tiny drizzle, people are liable to miss it, and what was the point then, when that money could have been used to add another area or two?

The tech builds upon itself and the artists are always trying to out-do what they did before. If they want to play things safe or ship a game quick, they can just keep an existing workflow and target 60fps. Later though, they'll want to show off how much better they've become and drop the frame-rate to 30.

Yes, they're a product of the film and lenses, but they were directly used in the '80s/'90s as a method of expressing the medium, much like a lot of content today uses glitch aesthetics to help reinforce a level of technicality. Watch Dogs is a good example of including artifacting, data bending and other aesthetic choices that play to the nature of computerization and technological advancement. This was exactly what filmmakers were doing in the sci-fi heyday; they were trying to directly reference the technological backing and "futurization".

I'm not saying removing post-processing is a good thing, or a necessity; I often do multiple passes of post-processing to ensure that effects sit cohesively in an image/movie/environment. What I'm saying is that those effects work best when they're subtle. From an aesthetic standpoint, I think this game looks phenomenal; I think they have a strong cohesive design, and their engine and artists are working to a unified vision. I'm talking more in general.

Yes, that tech is expensive to build, but so what? You can't let price dictate the direction. If you build a really advanced weather system and use it for subtle effects, that's a much better use of the money than taking that same system and burying all the time, money and effort spent producing amazing art and assets. Balance is key, and developers in general seem to lack the ability to balance effects; as a result we have these trends of effects that are overblown and at times just jarring and attention-grabbing.

The best design is always the design you never have to think about. If people are constantly saying "this effect is really good", that's not always a good thing in the context of the whole experience.
 
I have never seen a compelling argument that games should be played in 30 fps.

What you find compelling and what other people find compelling can be two completely different things. People don't have to agree when having a discussion, but they should respect the other person's viewpoint rather than being dismissive.
 
This whole "filmic" thing is getting out of hand. It's almost as if people don't understand the difference between a non-interactive film captured with a physical camera and images generated and rendered by a computer.
 
Changes aesthetics? That's a joke, right? Would you lock at 30 FPS if it were a PC game as well? Why can't these developers just admit they can't achieve 800p at 60fps?

Hahahaha.

800p? I didn't know The Order was 1422x800 resolution.

Anyway, silly argument. Frame rates in PC games generally aren't locked because it's not fixed hardware, so there's no point. In a console environment the devs can make that decision and work around it.

hahahahahahaha?

Serious question: do PC gamers who play at 120/144fps with LightBoost enter PC threads shitting on the plebs playing at a meagre 60fps?
 
The frame rate I'm comfortable with varies almost completely by context. Single player experiences against a computer, I'm fine with locked 30fps. (Generally.)

Multiplayer against live people, 60fps is essential to actually respond as quickly as necessary.

Same with film. 24-30 FPS for dramatic/narrative experiences is preferred. In sci-fi, fantasy, or anything else that leans heavily on the artifice of the medium, 48-60 fps is often more than you need, or even want, to see.

I don't need to see "LOTR: The High School Stage Show" at 48-60fps, rendering at a visual fidelity that the production design and stagecraft just doesn't support. 24fps feels like a filmed story. 48-60 feels like a taped stage production. It's incredibly hard to get away from.
 
Anyway, silly argument. Frame rates in PC games generally aren't locked because it's not fixed hardware, so there's no point. In a console environment the devs can make that decision and work around it.

Some devs do this; EA is one of them, in regard to NFS and how it performs on the PC as of late.


Serious question: do PC gamers who play at 120/144fps with LightBoost enter PC threads shitting on the plebs playing at a meagre 60fps?

I'm one of the few with a LightBoost or some form of strobing monitor that can do it, and no, I don't. I don't care what people prefer; I only go after devs/publishers, especially when fps is being thrown under the bus or used as a scapegoat for the fact that they can't maintain a stable fps at whatever rate they desire.
 
I guess my big problem with the cinematic and filmic approach is that I don't see the world through a camera lens, so a lot of the post-processing additions seem to just interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their rigor to their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to any degree we see in games); it's sad to see great art and texture work that's completely hidden under a wall of muggy effects.

I think my favorite example is this.

hdr-in-photography-vs-hdr-in-video-games.jpg

Bloody hell lol, they are not the same thing. High Dynamic Range in photography is basically where you take the lightest and darkest areas of exposure from multiple photos, then process and blend that information together to end up with a final shot with more balanced or neutral overall exposure and luminance.

HDR in games is largely the opposite: it aims to represent a larger range of lighting, in other words a more accurate rendering of the differences between light and dark while retaining detail, for more realistic overall lighting. The problem is that HDR implementations in games are often too strong, causing bloom-like, overly intense bright areas.
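Concretely, the in-game pipeline renders scene luminance over a wide range and then tone-maps it down into the display's [0, 1] range. Reinhard's classic global operator is one simple example (my choice for illustration; the posts don't name a specific operator):

```python
def reinhard(luminance):
    # Reinhard's global tone-mapping operator: compresses HDR scene
    # luminance in [0, inf) into displayable [0, 1), keeping detail
    # in both shadows and highlights instead of clipping whites.
    return luminance / (1.0 + luminance)
```

Dim values pass through nearly unchanged while very bright values approach, but never clip at, 1.0; the "too strong" look usually comes from aggressive bloom layered on top of this step rather than the mapping itself.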
 
I don't mind 30fps for games on consoles, but to say it's an aesthetic choice is just terrible spin to me. Drop the aesthetics and give me an option to choose 60fps if that's the case. I'll happily play and watch the game's scenes like a soap opera.
 
I can't believe people are actually eating up the 24fps cinematic experience shit :lol
I have only ever heard that phrase being used jokingly until now.
 
I don't mind 30fps for games on consoles, but to say it's an aesthetic choice is just terrible spin to me. Drop the aesthetics and give me an option to choose 60fps if that's the case. I'll happily play and watch the game's scenes like a soap opera.

How many 60 fps games looked like a soap opera to you?
 
captured with a physical camera

That's what filmic is!

That's the aesthetic they are trying to simulate, and it's worthwhile because it's a well-established and loved medium.

Honestly, complaining about it is like bemoaning that Limbo wasn't as colourful as a Nintendo title!

It's a valid artistic choice.
 
I am fine with 30fps for some games, but please just tell the truth, like "Our game is not really fast-paced and we want to get the best graphics; that's why we chose 30fps."


And 24fps... lol


I just started replaying Gothic 1-3, and Gothic 1 has a 25fps cap without a fix. With the fix you get 60fps+, but there is one thing that doesn't work at 60fps in Gothic (high climbing), and I had to go back to 25fps for a short time (restarting the whole game). In 25fps mode I got pretty sick really fast xD

edit: and if I play a game with mouse and keyboard, 60fps is a must-have for me. You just have much faster camera movements with a mouse; I couldn't handle that at 30fps. If a game is controlled with a pad and the camera movement is limited in speed, 30fps can be fine.
 
I can't believe people are actually eating up the 24fps cinematic experience shit :lol
I have only ever heard that phrase being used jokingly until now.

They've been talking about it like that for a year now.

PlayStation.Blog: So, exactly what kind of game is The Order: 1886?
Ru Weerasuriya: The Order is a third person action adventure with shooting mechanics. It’s very much story-based – it’s a linear story-based game. We’re trying to tell a story. It’s what we call a filmic experience.

PSB: What aspect of the game do you think will make gamers say “Woah, that’s new, I’ve never seen that before”?
RW: There are gameplay features we’ll be talking about that will be very, very cool. Right now we’re playing with things – the moment-to-moment gameplay is really not what you might expect. We didn’t want to make it single-tone, where you rely on one single thing at the expense of the rest.
The overall feel – that filmic experience… the one thing we brought to this is something people are accustomed to but usually can’t tell. When you watch a movie you don’t question what lens is being used. You don’t question why there is grain on the film or why there’s a certain lighting. Those are things we’ve been accustomed to seeing for 30 years. So when it’s missing we usually go “Wait, something is wrong with this image”.
With this game we replicated a lot of physical attributes. We have true lens distortion. We built physical lenses into our engine so we could get something where people will look at it and not be totally disconnected. Games have a tendency sometimes to be too clean and crisp. We thrive in the dirt. We just love the fact that it feels dirty. It’s filmed in a very realistic way.
 
How many 60 fps games looked like a soap opera to you?

da partially sarcastic part :D

Anyway, it's pretty hard to explain the choice of 30 fps over 60. Consoles should have more choices in their video settings now. If it's aesthetics, give me an option between 30 or 60 fps. If it's limitations on hardware, give me an option between high graphics + 30 fps or lowered graphics + 60 fps. I should be the one to decide what's better for me.

I feel console gaming should have reached a point where these options are readily available. The hardware in these consoles is as close to PCs as it's ever been.
 
It would be interesting to play a game that ran at 24fps and had the required motion blur to mimic actual film. Probably not a shooter like The Order, but a slower-paced game with film-quality IQ would be something to experience.
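For reference, the "required motion blur" of film follows from the camera's shutter angle: the shutter is open for a fixed fraction of each frame interval, and that exposure time sets the length of the blur trails. A quick sketch of the standard film arithmetic (a general convention, not anything from the thread):

```python
def shutter_time(fps, shutter_angle=180.0):
    # Exposure time per frame for a rotary-shutter film camera:
    # the shutter is open for (shutter_angle / 360) of each frame
    # interval, which determines how much motion smears per frame.
    return (shutter_angle / 360.0) / fps
```

At 24fps with the conventional 180-degree shutter, each frame is exposed for 1/48 s, so a game mimicking film would need to blur each frame's motion over roughly half the frame interval.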
 
Seeing the pictures and the video of The Order should be enough to justify that the game is a visual showpiece

But the scrutiny of pixels and frames by the gaming media and fans is overshadowing the creative, atmospheric, and fun qualities of gaming.

Even Sony themselves obnoxiously add fuel to the flames by touting 1080p/60fps as the mark of the best gaming experience, so I can hardly blame RaD for feeling they have to justify their design choice to do otherwise.
 