48 grew on me after watching it a few times.
It looks too fast though. It grows on you a bit but then when I go back to 24fps it feels better still.
24 looks much better as expected.
One thing that just occurred to me. Did anyone else notice that the two main backers of high-FPS filmmaking are also principal owners of visual effects companies (Cameron - Digital Domain, Jackson - Weta)? Visual effects would presumably become much more expensive if you have to animate and render twice as many frames, so they both stand to make a lot more money if this takes off.
At someone else's house. Since this LCD monitor can do 75Hz and 60Hz, should I set it to 60?
There's going to be telecine judder with both framerate videos at either refresh rate (it should be worse for 48p if you run at 60Hz). I'd test both to see which looks better. In theory 72Hz would be the better choice, but it depends on what cadence your video player is generating.
Camera movement in 48fps looks great. When the camera is stationary things appear sped up but I'm assuming that eventually it will seem more natural than 24fps. Shooting The Hobbit in 48fps was a bold choice and one that I'm supportive of, regardless of whether I end up liking it more than 24fps or not.
Thanks bluerei. Great examples.
Exactly my thoughts. I actually find the 48fps to be more "movie-like" than "television-like", but it still feels a little sped up at first. The panning does look a bit smoother.
Wow. That new 48fps video is much better. A lot smoother in general, and just kind of freaky to watch. It looks both faster and slower at the same time. At some points it almost looks like it's in slow motion (when you drop the basket), but it is buttery smooth.
3D like that is going to be crazy.
Thank you for making that new one. A definite improvement over the original one.
In this thread... people form opinions on 48fps film watching samples on 60Hz monitors. Fantastic.
BTW, I don't think you understand how much I hate the h.264 gamma boost. The original quality looks SOOOO much better. I could have used the QuickTime Animation codec, but the file probably would have been around 2GB.
I hate h.264.
While telecine judder is being introduced (and it's more irregular for the 48p capture) ... you can still make some judgments based on other characteristics.
I watched all videos on my 240hz TV, feels good man.
Yeah but... I've played enough games to know what doubling the fps does to image quality and smoothness. I don't need to see examples on a monitor that doesn't display what they're going for properly.
Games are not an accurate comparison, since this is using a non-standard shutter speed. A game would not show you that.
Only the cinema will show the true difference.
If you have a CRT, you could configure this to be an accurate comparison.
I guess those with 240Hz TVs get closer, but it's still not the same experience.
A 240Hz TV is no different than a 120Hz or 60Hz TV for this comparison.
feels bad man.
Your computer is performing pulldown to convert the content to 60p (well, it may not be for the 24p content, depending on your settings), which introduces telecine judder.
The TV is simply repeating each frame 4 times. Well, unless you have interpolation turned on; then it's fucking with it more.
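If anyone wants to see why the 48p judder is worse on a 60Hz screen, here's a quick cadence sketch I threw together (plain Python, my own illustration; it assumes the player simply holds each source frame until the next one is due, which, as noted above, depends on your player):

```python
# Sketch: how many display refreshes each source frame occupies on a
# fixed-rate display. Refresh k shows source frame floor(k * fps / hz),
# so uneven divisions hold some frames longer than others -- judder.
def cadence(source_fps: int, display_hz: int, n_frames: int = 8) -> list:
    counts = [0] * n_frames
    k = 0
    while True:
        frame = k * source_fps // display_hz
        if frame >= n_frames:
            return counts
        counts[frame] += 1
        k += 1

print(cadence(24, 60))   # [3, 2, 3, 2, ...]    the classic 3:2 pulldown
print(cadence(48, 60))   # [2, 1, 1, 1, 2, ...] irregular -> worse judder
print(cadence(24, 72))   # [3, 3, 3, ...]       even, why 72Hz suits 24p
print(cadence(48, 240))  # [5, 5, 5, ...]       even, if a TV accepted 48p
```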
Can you explain to me the difference between higher FPS and 120Hz? I understand the FPS aspect; what I'm not clear about is how Hz works and how similar or completely different it is from FPS.
120 Hz is just the refresh rate of the monitor. It's a spec that only pertains to the hardware, not the source video.
Pardon my ignorance, but when you enable the smoothing feature that today's HDTVs offer, is that an actual increase in FPS or is it just an illusion?
It's increasing the FPS, though it's via an algorithm to approximate the data. Basically the new frames are a simulation. Think of it like scaling. It's extrapolating what the missing data should be, and isn't perfect.
The algorithms and their settings vary quite a bit from TV to TV. Similarly, a director can vary the shutter speed and create vastly different results. So you really can't directly compare the two unless you actually know the specifics of the film and the specifics of what your TV's algorithm is doing.
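Just to illustrate the "new frames are a simulation" point, here's a toy sketch (my own Python/NumPy; real TVs estimate motion vectors and warp pixels along them, not a simple cross-fade like this):

```python
# Toy "smoothing": synthesize an in-between frame by cross-fading two
# real frames. This is only the crudest stand-in for what a TV's motion
# interpolator actually does.
import numpy as np

def fake_frame(a: np.ndarray, b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Simulated frame at position t (0..1) between frames a and b."""
    blend = (1.0 - t) * a.astype(np.float64) + t * b.astype(np.float64)
    return blend.astype(a.dtype)

# Double 60fps to 120fps by inserting one simulated frame per real pair.
frames_60 = [np.random.randint(0, 256, (360, 640, 3), dtype=np.uint8)
             for _ in range(4)]
frames_120 = []
for a, b in zip(frames_60, frames_60[1:]):
    frames_120.append(a)                 # real frame
    frames_120.append(fake_frame(a, b))  # simulated in-between frame
frames_120.append(frames_60[-1])         # last real frame has no pair
```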
Btw... Making movies at a higher FPS (48 or 60) would change how movies are made in terms of speed in scenes, right?
Not really.
So the 240hz profile on AMD Vision meant nothing? Even with the deinterlacing and pulldown options off?
Feels really bad now, total placebo mannn.
Really? I thought a 240Hz TV would be the only one where no pulldown was necessary for 48fps (just show each frame five times). I don't pretend to be an expert on this, though.
That would be true if the TVs actually accepted a 48p input. They currently don't.
I don't know the specifics of what AMD Vision is doing.
The problem is you're hooking it up to a TV, which only accepts specific input frequencies (60Hz, and I would assume 24Hz).
Awesome... Now I get it, thanks.
So there would be no adjustment whatsoever in how a director/editor directs & edits the scenes? So if a director wanted to make a fast movie with fast editing at 24fps, he would do the same while filming at 48fps? That would mean that higher FPS just improves smoothness rather than speed. Yes?
I guess it depends on what you're referring to.
For example, sometimes tough action scenes are shot with a higher framerate and the actors purposely act out the scene slower ... then in editing they drop frames to bring it down to the normal framerate. If it's done correctly the speed ends up being what the director had intended (the action scene looks faster than it was performed).
For something like that, the same principle would apply ... they'd just have to use a framerate faster than 48 when capturing the scene.
In terms of general editing, effects like CGI and stop motion are certainly affected by higher framerates. You don't want a serious mismatch between how many frames are created for the effects and the live footage, otherwise the effects will look obvious and your suspension of disbelief is broken.
An obvious example of this is movies that use stop motion mixed with live footage. Think of the end of the first Terminator when it's in the factory with no skin ... or the original Clash of the Titans ... etc.
Nope. Nothing would change. All it means is more frames per second of film. More data, less blur.
The TV is a Sony HX925; it claims 960Hz, but it's only a true 240Hz. Connected to the living room computer through HDMI, it's recognized as a TV panel, and it lets me set its video profile to 1080p@240Hz.
I unfortunately can't find much info on AMD Vision profiles. If you find any links to info, I could take a look and explain what it appears they're doing.
Quite a fall, knowing it is not refreshing at this rate (connected to the computer only, I assume), and it shows how much of a placebo it was (to me), since I perceived a nonexistent difference.
Yeah, I figured that many scenes in movies were shot slower than what they ended up being when the movie was finished. I imagined that some adjustments would have to be made when filming at higher fps.
For those scenarios though, the principle would be the same.
They calculate what percentage faster the framerate needs to be based on how slow they plan to act it out. The percentage will be the same regardless of the framerate. For example, if they plan to act something out at half the speed they want it to appear in the movie, they will need to film it at double the framerate ... regardless of what the original framerate is.
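To put numbers on that, here's a trivial sketch of my own with the same math the post above describes:

```python
# Overcranking math as described above: if the scene is performed at some
# fraction of its intended speed, scale the capture framerate by the
# inverse of that fraction -- the multiplier is independent of the
# delivery framerate.
def capture_fps(delivery_fps: float, acting_speed: float) -> float:
    """acting_speed=0.5 means the scene is performed at half speed."""
    return delivery_fps / acting_speed

print(capture_fps(24, 0.5))  # 48.0 -> shoot at 48fps for a 24fps movie
print(capture_fps(48, 0.5))  # 96.0 -> shoot at 96fps for a 48fps movie
```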