It doesn't feel terrible, and doesn't look anywhere near as bad as the previously posted examples.
Those were running at an inconsistent 12fps after all.
TVS. AND. MONITORS. THE DIFFERENCE IS IN THE DISPLAY NOT THE SYSTEM.
I posted on that earlier, and I've actively tried hooking a computer up to a TV before to see how it'd look. That's where the difference actually is. Consoles are just specialized computers anyway.
That weirdly doesn't actually look that bad, but half of that may be higher settings than consoles use (or the image being squished down), plus the fact I'm watching, not playing. It probably wouldn't feel that great actually playing at that, and it's a huge, huge mistake for anyone to actually target 24 over 30, especially without motion blur.
Yet that extra 6 fps makes a huge difference, being a 25% increase, and 60 is a multiple of 30 rather than 24, so the end result will come out better (then there's 120hz, but man, why get a computer monitor at that refresh rate only to play games at 24hz on it?). Yeah, it's not the worst thing or anything, but on current hardware it seems kind of stupid to go for it, and it'd probably hit doubly hard in racing or shooting; at least Dark Souls isn't really that fast paced of a game.

As the person playing, it was absolutely playable. Not as great as 60fps of course, but I was able to play. I was even able to parry an enemy, and dodge through the turtle enemy's swings.
Now to be fair, that appears to be a framerate generated without the normal optical blurring that characterizes what people mean when they're talking about a filmic look.
Guess what, video games aren't going to employ those film techniques either, so that's a moot point.
Motion blur is a standard visual effect in videogames. Done correctly, it drastically improves the apparent fluidity of motion.
Yes, and it's done for games currently at 30/60.
You know what else will help fluidity? Higher framerates for your games.
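Since motion blur keeps coming up: for anyone curious what "done correctly" even means mechanically, here's a deliberately naive sketch of frame-accumulation blur in Python. Real engines mostly use per-pixel velocity buffers instead, so treat this as an illustration of the idea, not how any shipping game does it:

```python
import numpy as np

def accumulate_blur(prev_accum, new_frame, persistence=0.6):
    """Naive frame-accumulation motion blur: blend the freshly
    rendered frame into an exponentially decaying history of
    previous frames. Higher persistence = longer trails."""
    return persistence * prev_accum + (1.0 - persistence) * new_frame

# Toy demo: a bright "object" moving one pixel per frame along a
# 1-D scanline; the accumulation buffer smears it into a trail.
width, frames = 12, 6
accum = np.zeros(width)
for t in range(frames):
    frame = np.zeros(width)
    frame[t] = 1.0                 # object position this frame
    accum = accumulate_blur(accum, frame)
print(np.round(accum, 3))          # fading trail behind the object
```

The persistence knob is basically a shutter-angle analogue: longer trails read as smoother motion at the same framerate.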
I assume you mean lighting? And I don't know anything about it really, but am pretty sure that isn't the case. If it were, something at 24fps with the "wrong lighting" would have the soap opera effect as well, no?
The game could do that instead of the TV. Problem solved. 3:2 pulldown sucks too much on too many TVs to go with 24fps over 30fps.
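To put numbers on the pulldown complaint, here's a quick back-of-the-envelope script (plain Python, no real display involved) showing why 24fps content can't be shown evenly on a 60Hz screen:

```python
def pulldown_cadence(content_fps, display_hz, frames=8):
    """Assign each content frame a whole number of display
    refreshes, carrying the fractional error forward. This
    mirrors what 3:2 pulldown does for 24fps on a 60Hz display."""
    ratio = display_hz / content_fps   # refreshes per frame (2.5 for 24 -> 60)
    cadence, err = [], 0.0
    for _ in range(frames):
        err += ratio
        held = int(err)                # whole refreshes this frame gets
        err -= held
        cadence.append(held)
    return cadence

refresh_ms = 1000 / 60
for held in pulldown_cadence(24, 60):
    print(held, "refreshes =", round(held * refresh_ms, 1), "ms on screen")
# Alternates 2 and 3: 33.3ms, 50ms, 33.3ms, 50ms... instead of a steady 41.7ms.
```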
Lol, yeah, fucking phone. But no, it is indeed the lighting. Soap opera has the lowest quality, everything about it is cheaply made.
It's not the lighting; the majority of the problem comes from the 60fps. The lighting is bad, but it would look more "film like" at 24FPS.
Now people on gaf are defending 30 fps for "cinematic purposes"? The PR people have won!
This is infuriating when somebody does this. How about you read the damn thread before making a post like this?
How many fps does real life run in?
You seriously don't realize that soap operas look considerably worse than everything else on TV that's also shot in 60 fps?
It might sound crazy, but I do have a 120hz monitor, and 24fps content like movies and TV shows (as well as games at 24fps!) looks a lot better on the monitor, because 120 is an exact multiple of 24, and every frame is shown for exactly 5 refreshes, rather than being shown for inconsistent times.
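The arithmetic behind that is easy to sanity-check; a tiny hypothetical snippet:

```python
def repeats_per_frame(content_fps, display_hz):
    """How many display refreshes each content frame gets.
    An integer means every frame is held equally long (smooth);
    a fraction means frames must alternate hold times (judder)."""
    return display_hz / content_fps

for hz in (60, 120):
    r = repeats_per_frame(24, hz)
    verdict = "uniform" if r == int(r) else "judder (uneven holds)"
    print(f"24fps on {hz}Hz: {r} refreshes/frame -> {verdict}")
# 24fps on 60Hz:  2.5 refreshes/frame -> judder (uneven holds)
# 24fps on 120Hz: 5.0 refreshes/frame -> uniform
```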
Now this is infuriating when somebody does this. How about you not assume I didn't read the thread when I did?
I'd been having trouble thinking of a response to this, but perhaps it's because there really isn't much to say. Yeah, you can give it a distinct feel, and make it look good still. Which is why movies still went with it even with CGI and going from film to digital where the old hangups no longer applied. There really isn't much more to say there, until you go into the fact games ARE ultimately an interactive experience and thus there's the question of if it can ever be acceptable, and when. And most likely that's only for games that actively are trying to be films, and not like the Order but like a David Cage game or LA Noire while investigating and not shooting. Shooting you absolutely need 30+, feels terrible trying to aim below that.
Ah crap you are crazy!
But yeah, I kind of expect that for TV/movies that monitor really could be better, at least in regards to motion. Maybe not necessarily color or IQ, but being an exact multiple, rather than this pulldown stuff, must make for a subtle yet significant improvement. Games, I'd only do it out of morbid curiosity at 4k downsampled with full AA on or something, and it's not something I'd stick with over 120.
I still make those 28.8k modem connection noises whenever I think too hard about something. My brain is still running an old Intel chip, and I stupidly bought one of those 'all in one' brains, meaning it's not easy to upgrade!
Did I say that? The "soap opera" effect refers to motion interpolation or motion flow on TVs, which gives movies and TV that 60FPS look. You seriously didn't know what the soap opera effect refers to?
Yeah, basically the term "soap opera effect" comes from the days when soap operas were shot on 60fps interlaced videotape to reduce the cost of filming each episode, which gave everything that floaty motion-blur look when played back through TV broadcasts.
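For anyone who genuinely didn't know what the effect refers to, the naive version of what those "motion flow" modes do can be sketched in a few lines. Actual TVs use motion-compensated interpolation per block, not a dumb cross-fade like this, so this is only a toy illustration of where the extra in-between frames come from:

```python
import numpy as np

def interpolate(frame_a, frame_b, t=0.5):
    """Crude frame interpolation: cross-fade between two real
    frames to synthesize an in-between one. The goal is the same
    as a TV's motion flow mode: turn 24/30fps input into a
    60fps-like output by inventing frames that were never shot."""
    return (1.0 - t) * frame_a + t * frame_b

a = np.array([0.0, 1.0, 0.0, 0.0])  # object at pixel 1 this frame
b = np.array([0.0, 0.0, 1.0, 0.0])  # object at pixel 2 next frame
print(interpolate(a, b))            # ghosted halfway blend: [0. 0.5 0.5 0.]
```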
60 FPS only looks "strange" in film because that isn't the standard. That's it. There's nothing in 24 FPS (or 30 for that matter) that looks any more "realistic" or "grounded".
60 FPS also looks significantly closer to real life, because it's much closer to how people perceive the real world than 24/30 FPS is. If you're going for "realism", 60 FPS is actually more realistic.
Yeah...that's just you. There's nothing artificial about 60 FPS.
... what?
I really hate this line of thinking. Home video cameras have been shooting at 60i for DECADES, same with broadcast cameras. It looks cheap, and not because of the resolution or quality of the video.
The idea that games are unplayable at 30FPS is a total fallacy anyways; some of the best games of the past generation were 30FPS, while also achieving better visuals and, in some people's minds, a better form of storytelling.
When people have one opinion they're just wrong.
When I have an opinion it's different.
This is actually part of why I linked that GameSpot video, and you'd need to watch whenever it's convenient, because there is a point where your brain's no longer tricked into thinking it's motion but perceiving it more as a literal slide show, and that's at around 17 fps. But even below that, games can be playable depending on the kind of gameplay they're going for. I think Ultima VII was something like 5 fps, and that came out on PC during the 16-bit years, when many people were playing SNES or Genesis with games that ran at 60 fps, at least for scrolling.

Well there you go, this is my point exactly, right on the damn head.
While there are opinions, you can go way before 30 fps before a game objectively becomes unplayable, and that depends on the genre. Something like chess would be playable at 1 fps after all, even if there's no damn way any modern action game could be playable at that framerate.
I don't know how many times I can say this... the argument I've presented is not about how a game plays, only the idea of the "film look" or "feel", 24FPS would not look bad at all if it was built properly. I'm not saying they should strive for it, all I'm saying is that the look and feel would change.
You're going to need to back up your assertion that obviously cheap elements are not the reason something looks cheap.
That 30fps feels smoother on consoles than on PC.
What is cheap about broadcast quality cameras?
And if resolution is the reason 60i looks cheap, then a 480i CRT, or an ad being shown in a small online video window, would ruin that 'film look' (3:2 pulldown aside).
No, it does not!
If you plug a controller (with its much less accurate motion vs. a mouse) into a PC, a game running at 30FPS will feel the same.
You forgot the TV. No seriously don't forget the TV.
I haven't followed TV LCD response times lately, but it probably has to do with the fact that last I checked TVs had slower LCD response time than monitors did, so the shift in colors can sort of create a natural motion blur, and thus "look better." Just like with film versus a game running at 24 or even 30 fps without any (good) motion blurring applied.
It probably only actually works when it's subtle and the game isn't THAT dependent on low LCD response times. But yeah, generally it's better to have lower response times, though I imagine any decent quality TV from within the last decade will do a pretty good job there, unlike monitors from around the late 90s/early 2000s.

That depends on the display (motion blur is a lot stronger than what you get on any modern LCD) and even then, LCD blurring does not look very nice.
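That response-time-as-free-motion-blur idea can be modeled crudely too. Toy numbers and invented time constants here; real panels have per-transition response curves, so this is just the shape of the argument:

```python
def pixel_transition(target, start=0.0, response_ms=16.0, frame_ms=16.7, steps=5):
    """Toy model of LCD response: each refresh the pixel closes
    a fixed fraction of the gap to its target value. A large
    response_ms (slow TV panel) leaves visible ghosting across
    several frames; a small one (fast monitor) settles in one."""
    level, trace = start, []
    rate = min(1.0, frame_ms / response_ms)  # fraction of gap closed per frame
    for _ in range(steps):
        level += (target - level) * rate
        trace.append(round(level, 3))
    return trace

print("slow TV panel :", pixel_transition(1.0, response_ms=50.0))
print("fast monitor  :", pixel_transition(1.0, response_ms=5.0))
# The slow panel is still mid-transition several frames later; that
# lag reads as a soft motion blur, which can flatter a 24/30fps game.
```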