I never thought about that... I want a 120hz monitor (even more) now.
Yep. It's the main benefit of 120Hz televisions: smoother DVD and Blu-ray playback, since 120 is an even multiple of film's 24fps. Just remember to turn off the motion interpolation stuff.
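To spell out that "even multiple" point, here's a rough back-of-the-envelope sketch (nothing TV-specific, just the arithmetic of fitting 24 film frames into a second's worth of refreshes):

```python
# Why 24fps film is uneven on a 60Hz panel but even on 120Hz:
# each film frame must be held for a whole number of display refreshes.

def hold_times(film_fps, refresh_hz, frames=6):
    """How many refreshes each of the first few film frames is held for."""
    holds, shown = [], 0
    for i in range(1, frames + 1):
        target = int(i * refresh_hz / film_fps)  # refresh on which frame i should end
        holds.append(target - shown)
        shown = target
    return holds

print(hold_times(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven 3:2 cadence, i.e. pulldown judder
print(hold_times(24, 120))  # [5, 5, 5, 5, 5, 5] -> every frame held equally long
```

On 60Hz every other film frame sits on screen 50% longer than its neighbour, which is what shows up as judder on slow pans; on 120Hz every frame gets exactly five refreshes.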
Framerates around 24fps have noticeable stuttering, even if only on a somewhat subconscious level. With movies we mostly only notice it consciously when the camera pans at a certain speed (which is why filmmakers avoid those pans). The thing is, for storytelling this super-subtle stuttering actually has a particular psychological effect that works well for cinematic purposes: it's almost like you're aware that you're watching a fast photo flipbook telling you a story. Speed things up to 60fps and that effect is lost, giving you the 'reality TV' feel. Neither is better than the other; they just serve different purposes.
This is my own theory on the subject, and I strongly believe in it from all the experience I've built up with CG, real-time graphics, and real-life cinematography.
It probably only works when it's subtle and the game isn't THAT dependent on low LCD response times. But yeah, generally it's better to have lower response times, and I imagine any decent-quality TV from within the last decade will do a pretty good job there, unlike monitors from around the late 90s/early 2000s.
I do wonder if there might be some TV processing to account for too. At any rate, I'm sure that whatever makes a game "look better at 30 FPS" is tied to the TV, and a PC hooked up to that TV WILL look "better" in the same way a console does. And hopefully that's in contrast to a PS4, or at least an XB1, not a PS3 or 360.
That 30fps feels smoother on consoles than on PC.
Perceptions are common, and common perceptions are also pretty common. They are just perceptions of intense subjectivity until someone proves to me that said perception touches some sense of common truth. I do not share in that perception.

The idea is that the perception exists for a reason.
Maybe his encoding broke? Not sure.

I've tested with motion blur turned off and it still looks nothing like the Watch_Dogs videos Dennis posted.
No, I don't believe it does at all in this case. The argument is regarding a "filmic look" in games, and films would judder if motion blur didn't exist. Games wishing to closely ape movie aesthetics need to implement motion blur to get closer to the intended look.

As I stated in another thread, motion blur should not be used as a band-aid to cover up a game unsteadily juddering along at 20fps-30fps. That crap needs to stop.
I laughed. (A quick review of our conversation):
While it may just be some kb/m versus controller thing, I have to assume A LOT of that "feels smoother" is still in the display, and I was personally going by experience with FFXIV on my PC versus a TV WITHOUT motion interpolation.

If you have the TV's motion interpolation turned on it can make things look smoother, but you get horrible latency, making nearly all games unplayable.
Nothing says that this has to be the TV though; plugging a controller into a PC and playing at 30FPS will prove that.
What is cheap about broadcast quality cameras?
And if resolution is the reason 60i looks cheap, then a 480i CRT, or an ad being shown in a small online video window, would ruin that 'film look' (3:2 pulldown aside).
Real life doesn't move in fast forward like 60fps looks
"Tests with Air force pilots have shown, that they could identify the plane on a flashed picture that was flashed only for 1/220th of a second."
Whenever I play something that is CG-heavy (like FFXIII) I set Auto Motion Plus to High for that 60fps look. Sure, it introduces some weird artifacts when the image pans too fast horizontally, but it just looks better. That's just my opinion, of course.
I did speculate on a possible uncanny valley effect, but if I'm right that'd be eliminated by even higher fps, not lower. Maybe if we can get true 240Hz monitors it can be eliminated as far as the eye can tell.
Guys, I don't know in what alternate reality you live, but no way in hell I feel real life less smooth than anything I saw running at 60FPS.
Because 60 fps is already a slideshow, 120 fps is where it at. If u wanna play a game at 30 fps, might as well go to a museum and look at paintings.
It also adds like 100 or 200ms of input lag.
Maybe, I've got my PC hooked up to the TV alongside a few consoles and I've not noticed one being smoother than the other at similar frame rates.
Resolution and fps are the last thing I look for in a game.....as long as it's at least 30 fps who cares?
I'm just quoting myself here cause I'm right and most of you are just bat shit insane when it comes to this topic. Srsly.
On a more serious note (not that I didn't sorta mean what I just said), I do simplify things a bit in the above explanation, but I am sure that that's what it comes down to. I say I simplify things because I'm not taking motion blur/shutter speed into account and such, but overall that just means fps is not the only factor; in that sense things are slightly more complex, but not by much.
In short: I just explained everything you need to know to understand the issue in that quote. Read it. /thread
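Since shutter speed keeps coming up: under the usual 180-degree shutter convention each frame is exposed for half of its frame interval, so the amount of blur baked into every frame shrinks as the framerate rises. A quick back-of-the-envelope sketch (just that convention applied, nothing measured from real footage):

```python
# Per-frame exposure (and therefore baked-in motion blur) under a 180-degree shutter,
# i.e. the shutter is open for half of each frame interval.

def exposure_ms(fps, shutter_angle=180):
    frame_interval = 1000.0 / fps                 # ms per frame
    return frame_interval * (shutter_angle / 360.0)

for fps in (24, 30, 48, 60):
    print(f"{fps} fps -> ~{exposure_ms(fps):.1f} ms of blur per frame")

# 24 fps -> ~20.8 ms, 60 fps -> ~8.3 ms: higher framerates carry far less blur
# per frame, which is part of why they read as 'sharper' and less film-like.
```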
Resolution and fps are the last thing I look for in a game.....as long as it's at least 60 fps who cares?
Not that I really think this, but you can see the point.
How many fps does real life run in?
Yeah, it's really sort of frustrating at times, the Hollywood envy that seems to run through games. Doubly so when it actually glitches: in the Tomb Raider reboot you can get water effects on the "camera" that never go away, so you have to quit out and load the game back up. Sometimes it's amusing to see, like a hornet crawling on the lens in MGS3, but it really does feel sometimes like they're not trying to bring us into a world so much as make it feel like we're watching a movie or TV show. And I'm not even THAT big on immersion, but since games are interactive I'd rather they err on the side of what actually makes sense for you being there rather than replicate our filming technology when it doesn't make sense or serve any purpose.

I guess my big problem with the "cinematic" and "filmic" approach is that I don't see the world through a camera lens, so a lot of the post-processing additions seem to just interfere with visual clarity and gameplay. Playing games like Shadow Fall and Battlefield, I found it hard at times to see... anything. That's not good, ever. Developers need to learn subtlety and refinement. While I appreciate their dedication to their aesthetic, I think they need to roll back a lot of the motion blur and depth of field (things that just don't happen in real life to anywhere near the degree we see in games); it's sad to see great art and texture work completely hidden under a wall of muggy effects.
I think my favorite example is this.
http://dudelol.com/oldimgs/hdr-in-photography-vs-hdr-in-video-games.jpg
Similarly, I swear FFXIV on PS4 looked basically identical to when I played the PC version on the TV. I tend not to have that issue with input lag; the worst is when I've tried streaming stuff like OnLive, though that can be manageable for the right games. Aiming can be a bitch though, which funnily enough I imagine could be what screws over any hypothetical cloud streaming future while shooters are the dominant genre.
That said, I can't play a game that has VSync enabled if I'm using a mouse/keyboard. I can barely control a simple pan around due to the extra latency, yet I can quite happily play for hours with VSync on if I'm using a 360 controller. Without that I'd probably downplay the effect different input devices have on perceived "smoothness" whilst playing.
Also, for people that just have to have 24fps, couldn't a game be built to have all the physics calculated at 60Hz or higher, the inputs polled at 60Hz or higher, etc., to give the desired fluidity of control, while presenting the visual information at 24fps the way some people like it? Add motion blur until people are satisfied - wouldn't that do the trick?
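That's essentially a fixed-timestep loop with the presentation throttled separately, which is a standard pattern. A minimal sketch of the idea (hypothetical function names, no real engine assumed):

```python
import time

SIM_HZ = 60                 # physics / input update rate
RENDER_FPS = 24             # presentation rate "the way some people like it"
SIM_DT = 1.0 / SIM_HZ
FRAME_DT = 1.0 / RENDER_FPS

def poll_input():
    return {}               # placeholder: read pad / mouse / keyboard state here

def simulate(state, inputs, dt):
    state["t"] = state.get("t", 0.0) + dt   # placeholder: advance physics by a fixed dt

def render(state):
    pass                    # placeholder: draw (and motion-blur) the latest simulated state

def run(duration=1.0):
    state, accum = {}, 0.0
    last = time.perf_counter()
    next_frame = last
    while state.get("t", 0.0) < duration:
        now = time.perf_counter()
        accum += now - last
        last = now

        # input and physics always tick at 60 Hz, no matter how often we draw
        while accum >= SIM_DT:
            simulate(state, poll_input(), SIM_DT)
            accum -= SIM_DT

        # but a new image is only presented every 1/24 s
        if now >= next_frame:
            render(state)
            next_frame += FRAME_DT

        time.sleep(0.001)   # don't spin the CPU

run()
```

The motion blur part is the hand-wavy bit: presumably you'd accumulate it from the 60Hz steps that happen between presented frames, but the control fluidity doesn't depend on that.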
Also, I decided to enjoy this whole debate and show a scene (obviously not interactive for the viewership, which would prove 60fps' worth even more) where high motion is a detriment at a low framerate. I also chose a game with one of the best motion blur implementations next to Ryse, which would be Metro Last Light. Rendered separately at locked framerates with VSync so as to make the motion blur stepping work correctly.
24fps
http://a.pomf.se/klrvan.webm
60fps
http://a.pomf.se/orutqz.webm
I think the nuances of camera work are so ingrained into our perceptions that it's worth doing these things. I have no problem with it; to me it's visual effects at this point. No, it's not realistic, but then I am watching it on a TV screen in any case. I'm not expecting a totally realistic interpretation of the world; I expect it to look like everything else I see on a TV screen.
I think most people hold 30fps to be the bare minimum for playability.
I thought the Hobbit movies looked strange, not in a good way.
We need James Cameron to show us the true power of 48fps.

I was just watching The Hobbit DoS Blu-ray and some of the sets look strange even at 24fps. They look fake. Maybe it's the lighting? I don't know. I want to see other movies at 48fps or 60fps.
As the person playing, it was absolutely playable. Not as great as 60fps of course, but I was able to play. I was even able to parry an enemy and dodge through the turtle enemies' swings.
It doesn't feel terrible, and doesn't look anywhere near as bad as the previously posted examples.
Enemy attack animations play out slower the lower the framerate is. As in, the speed is locked to the framerate, so technically, playing at 24 FPS should make the game easier.
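If the attack animation really is stepped once per rendered frame rather than by elapsed time, that follows directly. A toy illustration with made-up numbers (not taken from any particular game):

```python
# Toy comparison: an attack animation advanced per rendered frame vs. by delta time.
# Hypothetical numbers, not from any specific game.

ATTACK_FRAMES = 30     # animation authored as 30 steps, meant to last 1 s at 30 fps

def duration_frame_locked(fps):
    # one animation step per rendered frame, so real time stretches as fps drops
    return ATTACK_FRAMES / fps

def duration_delta_time(fps):
    # each frame advances the animation by the elapsed time, so duration stays constant
    return ATTACK_FRAMES / 30.0

for fps in (24, 30, 60):
    print(f"{fps:>2} fps: frame-locked {duration_frame_locked(fps):.2f}s, "
          f"delta-time {duration_delta_time(fps):.2f}s")

# 24 fps: frame-locked 1.25s -> the attack really does play out slower
# 60 fps: frame-locked 0.50s -> and faster when the framerate climbs
```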
Journey is barely even a game.
Oh god I really hope you're kidding
As I understand it, we can neither measure nor detect time shorter than one Planck unit.
That doesn't mean that time shorter than one Planck unit doesn't exist.
I'm sticking with infinite.
Resolution and fps are the last thing I look for in a game.....as long as it's at least 30 fps who cares?
OH GOD SO HORRIBLE 30fps 720p GAWD DAMN MY EYES ITS SO SHITTY
Holding to a 1080p/60fps standard for this gen will hold a lot of cool potential gameplay and graphics back. I would argue a lot of 60fps games last gen were basically better-looking PS2 games: lots of small corridors, arenas, or tracks. Very few 60fps games last gen really pushed new gameplay or amazing graphics (I'll give you Rage, Burnout, and COD), so holding it as the gold standard doesn't mean it results in better games. Most of the best franchises last gen (Last of Us, Gears of War, God of War, Journey, Halo, Flower) pushed hardware to the limits, resulting in some stellar games that would not have been the same if they'd held to a 60fps standard.
And isn't Journey coming to PS4 (presumably in 1080p/60fps) or is that just me dreaming?

Flower and Last of Us are 60 fps on PS4. The new Halo will be 60 fps. Downgrades all around.
Or the complete lack of "loading screens"
I'm not, speaking from my own experience, sadly.
Personally, I only play games capped at 1fps to get that 'comicbook' experience.