The Hobbit 48fps first impressions

LOL Cameron, with Avatar being 3D and Avatar 2 being 60p, he is incessantly trying to fuck up the Blu-ray consortium's standards.

Averaging 28.8 Mbps, the first Avatar's 2D theatrical cut takes up 44.9 GB of a 50 GB dual-layer Blu-ray disc.

If Avatar 2 somehow had an identical runtime and the home release was encoded at the exact same video bitrate, but was 3D and 60p, it would take up 224.5 GB.
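For anyone who wants to sanity-check that figure, it's just the 2D/24p footprint multiplied by the number of views and the frame-rate ratio. A quick Python sketch (the numbers are the ones quoted above, the function name is mine, and as the reply below points out real encoders wouldn't scale this way):

    # Naive scaling: assumes every extra view and every extra frame costs full
    # price, which real codecs avoid by exploiting redundancy (see reply below).
    def naive_size_gb(base_gb, views=1, fps_new=24, fps_old=24):
        return base_gb * views * (fps_new / fps_old)

    print(naive_size_gb(44.9))                        # 2D, 24p -> 44.9 GB
    print(naive_size_gb(44.9, views=2, fps_new=60))   # 3D, 60p -> 224.5 GB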
Not that simple. H.264 works by finding similar information between frames, and the information for the left eye and the right eye is mostly the same. This is why MVC-encoded 3D Blu-rays don't need to be 100 GB to retain the same image quality as 2D.
http://en.wikipedia.org/wiki/Multiview_Video_Coding

Can Blu-ray players be updated to support 48fps, and will 50 GB be enough for 48fps movies? I dunno.
 
Christ almighty there is so much misinformation in this thread it makes me ill.

Frame rates in games and frame rates in movies have absolutely 100% nothing to do with each other. Movies are optical: motion blur is dictated by the amount of time the sensor or film stock is exposed to light, which is set by a combination of frame rate and shutter speed/angle. Games generate frames on the fly on a video card, with motion blur controlled in no way, shape, or form by the frame rate. Seriously, guys, they have nothing at all to do with each other. 60 FPS gaming ≠ 60 FPS movies.

If you want to know what something filmed and displayed at 60FPS looks like, then just watch 90% of TV produced from the early 80s through the late 90s. That was almost all 60fps shown on a 60Hz TV. Did all of you "60FPS IS THE BEST!!!" people love the way late-80s TV looked?

The frame rate cannot be fully discussed without also talking about the shutter speed/angle. Movies are typically shot at 24FPS with a 180 degree shutter, which means that each frame is exposed to light for 1/48th of a second. Movies like Saving Private Ryan and 25th Hour played with the shutter angle so that each frame was exposed for less time, thereby reducing motion blur and making action "sharper." Peter Jackson is filming at 48FPS, but we have no idea what shutter speed he is using (someone said 270 degrees, but I have not seen that confirmed anywhere, and do they mean 270 degrees of the shutter closed or open?). If he is using 270 degrees of the shutter open (each frame exposed to light for 1/64th of a second), then that would reintroduce a lot of the motion blur which would otherwise have been removed by shooting at 48FPS with a 180 degree shutter (each frame only exposed to light for 1/96th of a second).
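If the shutter math above isn't obvious: exposure time is just (shutter angle / 360) divided by the frame rate. A tiny Python sketch (function name is mine):

    def exposure_seconds(fps, shutter_angle_deg):
        """Fraction of each frame interval that the shutter is open."""
        return (shutter_angle_deg / 360.0) / fps

    print(1 / exposure_seconds(24, 180))   # 48.0 -> 1/48 s, the classic film look
    print(1 / exposure_seconds(48, 180))   # 96.0 -> 1/96 s
    print(1 / exposure_seconds(48, 270))   # 64.0 -> 1/64 s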

To those saying "well, I'll just convert the 48FPS footage to 24FPS": it will still look off in its motion because of the motion blur issues. I was just at NAB and I saw 4K footage shot at 60FPS conformed to 24FPS (without using it for slo-mo), and believe me, I could tell exactly which shots were 60FPS conformed vs. 24FPS native. It did not look nearly as good as the stuff which was shot at 24FPS. And this was all shot by Jeff Cronenweth, David Fincher's cinematographer.
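As an aside, a straight conform from a higher frame rate down to 24 is just dropping frames, and it does nothing to the blur already baked into each surviving frame. A rough Python illustration (real conforms happen in the NLE; the function name is mine):

    def conform_indices(n_frames, fps_in, fps_out=24):
        """Indices of the source frames kept when conforming fps_in -> fps_out."""
        step = fps_in / fps_out
        return [round(i * step) for i in range(int(n_frames / step))]

    print(conform_indices(10, 48))   # [0, 2, 4, 6, 8] -- keep every other frame
    print(conform_indices(10, 60))   # [0, 2, 5, 8]    -- uneven 60 -> 24 cadence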

My suspicion is this will look much more like "video" than what we typically think of as looking like "film" and that is a bad thing. I don't go to the movies for things to look real. I go to the movies to be told a story. I make movies, I know how fake everything is. I don't want to see that fakeness up on the screen.
 
Not that simple. H.264 works by finding similar information between frames, and the information for the left eye and the right eye is mostly the same. This is why MVC-encoded 3D Blu-rays don't need to be 100 GB.
http://en.wikipedia.org/wiki/Multiview_Video_Coding
To some extent, 48p content has compression gains as well: the higher the frame rate, the less difference between frames, so the more you can compress. Just how it compares to 3D on average, though, I don't know. I think it obviously won't compress as much, but then again 50 GB is likely enough.
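Just to put rough numbers on that uncertainty, here's a toy calculator where the cost of 48p relative to 24p is an explicit knob. The factors swept below are guesses for illustration only, not measured values, and the base figure is the Avatar number quoted earlier:

    # Toy "will it fit on a BD-50?" calculator. The 48p cost factors are
    # assumptions, not measurements; a real encode could also simply lower the
    # overall bitrate to fit the disc.
    base_gb = 44.9                      # Avatar 2D/24p figure quoted earlier
    for hfr_factor in (1.3, 1.6, 2.0):  # assumed size of 48p relative to 24p
        print(f"48p at x{hfr_factor}: {base_gb * hfr_factor:.1f} GB")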
Can Blu-ray players be updated to support 48fps, and will 50 GB be enough for 48fps movies? I dunno.
maybe
 
By the time Avatar 2 comes out we'll be using holodiscs or some bullshit so it won't matter then. :lol

I was talking in reference to including both a 24fps and a 48fps version of the movie on the same disc. And this movie is 3D to boot... and Blu-ray only supports a high frame rate (60fps) at 720p anyway. I mean, this would have to be one crazy firmware update; I doubt the hardware in most Blu-ray players would support something like this. Yeah, I just don't see this happening.

I would say they should just ship it on a 250 GB USB HDD designed as some prop from the movie, but that's completely impractical and only I would buy such a thing. A 7200 RPM hard disk is more than fast enough to transfer at 200-250 megabits/second.

But yeah... HOLO DISKS. Avatar 2 isn't gonna have a home release until 2016.
 
To those saying "well, I'll just convert the 48FPS footage to 24FPS": it will still look off in its motion because of the motion blur issues. I was just at NAB and I saw 4K footage shot at 60FPS conformed to 24FPS (without using it for slo-mo), and believe me, I could tell exactly which shots were 60FPS conformed vs. 24FPS native. It did not look nearly as good as the stuff which was shot at 24FPS. And this was all shot by Jeff Cronenweth, David Fincher's cinematographer.

The Hobbit trailer looks perfectly fine at 24fps
 
Christ almighty there is so much misinformation in this thread it makes me ill.

Frame rates in games and frame rates in movies have absolutely 100% nothing to do with each other. Movies are optical: motion blur is dictated by the amount of time the sensor or film stock is exposed to light, which is set by a combination of frame rate and shutter speed/angle. Games generate frames on the fly on a video card, with motion blur controlled in no way, shape, or form by the frame rate. Seriously, guys, they have nothing at all to do with each other. 60 FPS gaming ≠ 60 FPS movies.

If you want to know what something filmed and displayed at 60FPS looks like, then just watch 90% of TV produced from the early 80s through the late 90s. That was almost all 60fps shown on a 60Hz TV. Did all of you "60FPS IS THE BEST!!!" people love the way late-80s TV looked?

The frame rate cannot be fully discussed without also talking about the shutter speed/angle. Movies are typically shot at 24FPS with a 180 degree shutter, which means that each frame is exposed to light for 1/48th of a second. Movies like Saving Private Ryan and 25th Hour played with the shutter angle so that each frame was exposed for less time, thereby reducing motion blur and making action "sharper." Peter Jackson is filming at 48FPS, but we have no idea what shutter speed he is using (someone said 270 degrees, but I have not seen that confirmed anywhere, and do they mean 270 degrees of the shutter closed or open?). If he is using 270 degrees of the shutter open (each frame exposed to light for 1/64th of a second), then that would reintroduce a lot of the motion blur which would otherwise have been removed by shooting at 48FPS with a 180 degree shutter (each frame only exposed to light for 1/96th of a second).
IIRC, most 'video' for TV usage was actually shot at 30p? Agree with everything else.

To those saying "well, I'll just convert the 48FPS footage to 24FPS": it will still look off in its motion because of the motion blur issues. I was just at NAB and I saw 4K footage shot at 60FPS conformed to 24FPS (without using it for slo-mo), and believe me, I could tell exactly which shots were 60FPS conformed vs. 24FPS native. It did not look nearly as good as the stuff which was shot at 24FPS. And this was all shot by Jeff Cronenweth, David Fincher's cinematographer.

My suspicion is this will look much more like "video" than what we typically think of as looking like "film" and that is a bad thing. I don't go to the movies for things to look real. I go to the movies to be told a story. I make movies, I know how fake everything is. I don't want to see that fakeness up on the screen.
It's also possible they could add blur in post for the 24p release. Shouldn't be all that hard to do fairly accurately, but who knows if they will.
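The crudest version of that would be averaging each pair of 48p frames into one 24p frame, roughly faking a wider shutter; real tools would do something smarter with optical flow. A minimal numpy sketch (names and shapes are mine):

    import numpy as np

    def blend_48p_to_24p(frames):
        """frames: (N, H, W, C) array of 48p frames, N even.
        Averages adjacent pairs, roughly simulating a longer exposure per 24p frame."""
        pairs = frames.reshape(-1, 2, *frames.shape[1:])
        return pairs.mean(axis=1)

    clip_48p = np.random.rand(8, 4, 4, 3)    # stand-in for 8 frames of footage
    print(blend_48p_to_24p(clip_48p).shape)  # (4, 4, 4, 3)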
 
Another thing is that they are shooting The Hobbit with an open shutter, allowing the option for 24fps like they did with the trailer. An open shutter at 48p is the same as a 180 degree shutter at 24p, giving that cinematic feel.
 
The Hobbit trailer looks perfectly fine at 24fps
It's fairly close, but scenes with motion in them look a bit more 'crisp' for lack of a better term. Since they're using a 270 degree shutter on the Hobbit, I'm curious what a 360 degree shutter will look like at 48fps.


Another thing is that they are shooting The Hobbit with an open shutter, allowing the option for 24fps like they did with the trailer. An open shutter at 48p is the same as a 180 degree shutter at 24p, giving that cinematic feel.
99.9% sure it's 270 degrees. If they had used 360, you'd be right.
 
Christ almighty there is so much misinformation in this thread it makes me ill.

Frame rates in games and frame rates in movies have absolutely 100% nothing to do with each other. Movies are optical: motion blur is dictated by the amount of time the sensor or film stock is exposed to light, which is set by a combination of frame rate and shutter speed/angle. Games generate frames on the fly on a video card, with motion blur controlled in no way, shape, or form by the frame rate. Seriously, guys, they have nothing at all to do with each other. 60 FPS gaming ≠ 60 FPS movies.

If you want to know what something filmed and displayed at 60FPS looks like, then just watch 90% of TV produced from the early 80s through the late 90s. That was almost all 60fps shown on a 60Hz TV. Did all of you "60FPS IS THE BEST!!!" people love the way late-80s TV looked?

The frame rate cannot be fully discussed without also talking about the shutter speed/angle. Movies are typically shot at 24FPS with a 180 degree shutter, which means that each frame is exposed to light for 1/48th of a second. Movies like Saving Private Ryan and 25th Hour played with the shutter angle so that each frame was exposed for less time, thereby reducing motion blur and making action "sharper." Peter Jackson is filming at 48FPS, but we have no idea what shutter speed he is using (someone said 270 degrees, but I have not seen that confirmed anywhere, and do they mean 270 degrees of the shutter closed or open?). If he is using 270 degrees of the shutter open (each frame exposed to light for 1/64th of a second), then that would reintroduce a lot of the motion blur which would otherwise have been removed by shooting at 48FPS with a 180 degree shutter (each frame only exposed to light for 1/96th of a second).

To those saying "well, I'll just convert the 48FPS footage to 24FPS": it will still look off in its motion because of the motion blur issues. I was just at NAB and I saw 4K footage shot at 60FPS conformed to 24FPS (without using it for slo-mo), and believe me, I could tell exactly which shots were 60FPS conformed vs. 24FPS native. It did not look nearly as good as the stuff which was shot at 24FPS. And this was all shot by Jeff Cronenweth, David Fincher's cinematographer.

My suspicion is this will look much more like "video" than what we typically think of as looking like "film" and that is a bad thing. I don't go to the movies for things to look real. I go to the movies to be told a story. I make movies, I know how fake everything is. I don't want to see that fakeness up on the screen.

Your rant ends here mister! :P
 
It's fairly close, but scenes with motion in them look a bit more 'crisp' for lack of a better term. Since they're using a 270 degree shutter on the Hobbit, I'm curious what a 360 degree shutter will look like at 48fps.



99.9% sure it's 270 degrees. If they had used 360, you'd be right.

Yeah it's 270 for flickering reasons.
 
It's fairly close, but scenes with motion in them look a bit more 'crisp' for lack of a better term. Lacks a bit of the familiar motion blur. Since they're using a 270 degree shutter on the Hobbit, I'm curious what a 360 degree shutter will look like at 48fps.

I am also curious about that. It might be a best-of-both-worlds scenario, because you would be exposing the image for 1/48th of a second just like traditional film, but you would lose a lot of the fast-pan judder by having a higher frame rate. Can you link to where you found that they were shooting with a 270 degree shutter?
 
a Master Ninja said:
Motion interpolation shit doesn't add detail or clarity, it just blends frames in an attempt to make motion look smoother.
While that's true, the human brain is quite capable of "temporal upsampling" (and in fact we do it all the time), i.e. combining information from multiple "frames" into more perceived detail than is there in any single frame (you can easily observe this in the real world, as well as when watching video).

While I don't know if there's research to prove it, it does seem possible for interpolated frames to add more brain-perceived detail in some cases too.
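To make the "it just blends frames" point concrete, the most naive interpolation is literally a cross-fade between neighbouring frames (TVs use motion-compensated versions of the same idea); either way, no genuinely new detail is created. A toy numpy sketch, names mine:

    import numpy as np

    def double_framerate_by_blending(frames):
        """Insert a 50/50 blend between each pair of consecutive frames.
        Motion looks smoother, but every inserted frame is a mix of existing ones."""
        out = []
        for a, b in zip(frames[:-1], frames[1:]):
            out.append(a)
            out.append((a + b) / 2.0)   # the interpolated in-between frame
        out.append(frames[-1])
        return np.stack(out)

    clip = np.random.rand(5, 4, 4, 3)                  # 5 dummy frames
    print(double_framerate_by_blending(clip).shape)    # (9, 4, 4, 3)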
 
Most of it was shot at 60i actually, until progressive cameras became more commonplace in the mid-to-late 90s.
That's not true of analog video. It's not like it was capturing interlaced images to tape.

Obviously it was broadcast and displayed as 60i until recently though.

While that's true, the human brain is quite capable of "temporal upsampling" (and in fact we do it all the time), i.e. combining information from multiple "frames" into more perceived detail than is there in any single frame (you can easily observe this in the real world, as well as when watching video).

While I don't know if there's research to prove it, it does seem possible for interpolated frames to add more brain-perceived detail in some cases too.
Another long pole is for people using sample-and-hold displays (LCDs, etc.): when you increase the frame rate, you reduce the hold time, which improves temporal resolution.

The longer an image is held, the longer it "burns in" to our retina, for lack of a better term. There are a lot of complex interactions between our brain, hold times, and the perception of motion.
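For concreteness, on a sample-and-hold panel the hold time is just the frame period, so raising the frame rate shortens it directly (trivial sketch):

    # Hold time on a sample-and-hold display is simply the frame period:
    # each frame stays on screen until the next one replaces it.
    for fps in (24, 48, 60, 120):
        print(f"{fps} fps -> ~{1000 / fps:.1f} ms hold per frame")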
 
Would still like to see samples with an open (360 degree) shutter.
Yeah, I edited my post.


In principle at least, he's doing what I had hoped... opening up the shutter some more so it will be somewhat of a balance between temporal resolution and some pleasing "cinematic" blur.

Whether that specific shutter angle is optimal remains to be seen. I'm just glad he didn't go with the default, as I suspect that would look a bit too uncanny valley.

Half a page without anyone saying "60fps in a game is good, so 60fps in a movie must be good too." Congrats, guys!
Why was that included in a reply to me?

*knocks over projector*
 
So even if I go to a digital theater, hell, even if I watch the thing in 3D, I won't be able to know whether I'll be paying for the 24fps or the 48fps version?

This is Hollywood -- land of Surround Sound, Technicolor, Dolby Digital, DTS, IMAX, etc., etc. I think they are going to come up with some kind of marketing word for 48fps that they can plaster onto a marquee (and probably charge you an extra $1 on your ticket for).
 
:p

What I need to research is whether any of the Sony 4K projectors that support 48p are available around me.

I don't even know how to go about that right now. I was gonna ask about it, but then I decided it's probably best to wait until closer to the film, because theaters may wait to upgrade until later this year. I read that software updates and such can be used for certain things.
 
Frame rates in games and frame rates in movies have absolutely 100% nothing to do with each other. Movies are optical: motion blur is dictated by the amount of time the sensor or film stock is exposed to light, which is set by a combination of frame rate and shutter speed/angle. Games generate frames on the fly on a video card, with motion blur controlled in no way, shape, or form by the frame rate. Seriously, guys, they have nothing at all to do with each other. 60 FPS gaming ≠ 60 FPS movies.

What are you talking about? Of course they're directly comparable; computer games are mimicking what a real camera does. Granted, the motion blur is not perfect, but that's only because of limited resources. Are you going to tell me Pixar movies aren't really 24FPS? Computer games are just primitive versions of the same techniques.

If you want to know what something filmed and displayed at 60FPS looks like, then just watch 90% of TV produced from the early 80s through the late 90s. That was almost all 60fps shown on a 60Hz TV. Did all of you "60FPS IS THE BEST!!!" people love the way late-80s TV looked?

That's 60i standard def recorded on analogue video; there's a cavern of difference between that and what we're talking about with The Hobbit.

My suspicion is this will look much more like "video" than what we typically think of as looking like "film" and that is a bad thing. I don't go to the movies for things to look real. I go to the movies to be told a story. I make movies, I know how fake everything is. I don't want to see that fakeness up on the screen.

I said this earlier in the thread: the main reason it looks like this is the jump from film on the Lord of the Rings films to digital. 48FPS we haven't really seen except in computer games. It's totally valid to talk about how that improves the visuals.
 
While I agree with many that 24fps feels more cinematic, I also believe this is in large part due to conditioning.

Take a comment like "looks cheap, like a soap opera or something made for TV." What if all quality films were at 60fps and everything low-budget had a low frame rate... I can almost guarantee that, to a person, the opposite argument would be made.

We are in a transition period. Surround sound had a transition period too: from gimmicks that took you out of the movie to now (usually) much better use that enhances immersion.

I thought 3D would get there as well... it seems the technological challenges and people's enjoyment of the gimmicky "pop out at you" effect have dragged this out. Personally I can't stand 3D within its current limits: dimness, collapsed viewing angle, inability to work well outside of the sweet spot, etc.

48fps is different from these; it is about a conditioned response. I haven't seen it in person, and I will probably not like it at first, but I have every expectation that this should logically lead to better experiences once we break the conditioning.
 
What are you talking about? Of course they're directly comparable; computer games are mimicking what a real camera does. Granted, the motion blur is not perfect, but that's only because of limited resources. Are you going to tell me Pixar movies aren't really 24FPS? Computer games are just primitive versions of the same techniques.

No, they are not at all. Pixar does everything it can to make its movies look like they were actually shot, and they spend literally days rendering the motion blur into each frame to make it look like film. Game motion blur is not at all dictated by the frame rate. Graphics ≠ optics.
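That decoupling is easy to show: offline renderers (and some games) approximate motion blur by averaging several sub-frame samples across whatever shutter interval they want to simulate, and the output frame rate never enters into it. A toy Python sketch, with a made-up stand-in for the renderer:

    # Accumulation-style motion blur: average samples taken across the shutter
    # interval. 'render_scene_at' is a stand-in for a real renderer.
    def render_scene_at(t):
        return t * 10.0   # pretend the "image" is just a value that moves over time

    def motion_blurred_frame(frame_start, shutter_open_s, samples=8):
        times = [frame_start + shutter_open_s * i / (samples - 1) for i in range(samples)]
        return sum(render_scene_at(t) for t in times) / samples

    # Same output frame rate, two different amounts of blur: the blur is set by
    # the simulated shutter time, not by the frame rate you display at.
    print(motion_blurred_frame(0.0, shutter_open_s=1/48))   # 180-degree-equivalent blur
    print(motion_blurred_frame(0.0, shutter_open_s=1/96))   # crisper, shorter "exposure"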


That's 60i standard def recorded on analogue video; there's a cavern of difference between that and what we're talking about with The Hobbit.

True, but that was to address the "60FPS is always better" crowd. Late-80s television proves that it is, in fact, not always better.


I said this earlier in the thread: the main reason it looks like this is the jump from film on the Lord of the Rings films to digital. 48FPS we haven't really seen except in computer games. It's totally valid to talk about how that improves the visuals.

Not true at all. There have been a ton of digital movies which do not look fake in the way people have been describing The Hobbit. Remember that the big advance for digital was the Varicam, whose main selling point was its ability to shoot 24FPS so that things would look like movies. See also indie filmmakers flocking to the Panasonic DVX100 when it was released, because it shot 24FPS. Digital vs. film has a lot of impact on the dynamic range and color saturation of the image, but frame rate also has a massive impact on the aesthetic. I promise you could show me 24FPS footage with some motion in it and 30FPS footage, and I could immediately tell you which is which. In fact, I do it all the time when the film students I work with mess up and film something at the wrong frame rate.
 
Who says higher = better?

It's simple math, mcfrank.

It is pretty simple indeed.

Why is it better? We don't see in frames. We are built for a boatload of information processing; lower frame rates are just "fooling" us.

Of course, there is a point where the number of additional unique frames would be perceptually insignificant compared to the cost and work needed to achieve it.
If they (Cameron) think 60 frames is worth the cost compared to 30 on digital, they might as well do it. Frame-wise only, there is no advantage in animating with fewer frames.
All this talk about exposure, shutter angle, and motion blur concerns technical choices that, other than "mechanical sync," aren't deterrents to quality (again :P).
 