
Netflix Irishman de-aging tech vs deepfake

That shit was so creepy I couldn't pay attention to the movie. I stopped watching when Al Pacino was introduced.
 
Why aren't studios using this tech? It's almost like the VFX studios are unionized and don't want an "easy button."

It learns off of a library of recordings of that person, right? Maybe getting the rights to all those films to build that up is prohibitively expensive?
 
It learns off of a library of recordings of that person, right? Maybe getting the rights to all those films to build that up is prohibitively expensive?
Producers have studios they like to work with to handle these things. Most likely ML-based technologies weren't even a consideration; the traditional training of FX artists doesn't include that knowledge. Machine learning applied to FX is still a recent breakthrough, and most FX studios haven't adopted it.

Yeah, I mean, imagine the layoffs. Also, as good as deepfakes are, you'd still need someone to finesse the results.

That is a myth. New technologies enable new markets. The artists that don't adapt will lose their jobs, while those that do will be in high demand.
 
I still don't like Hollywood's de-aging technique.
While it has gotten a lot better over time, there's always something off. The eyes in particular look kind of dead and glassy.

And while deepfakes can look really good, they still look too soft and mushy and would stick out too much in a 4K production.



On a side note:
I discovered those George Lucas deepfakes on the Collider channel yesterday and have been laughing my ass off since. It's fucking hilarious.
Comedy looks to be an amazing use case for deepfakes :D


 
I mean, you can tell the difference. The deepfake looks super smooth. At least with the de-aging you can see pores and skin wrinkles and stuff. Either way, none of it matters much to me, I didn't look for it too much when watching.
 
Why arent studios using this tech? It's almost like the vfx studios are unionized and don't want an "easy button."
That's not how VFX works. Deepfakes are far from perfect, as you can easily tell that something's off. For short low-res TikTok clips it may be fine, but at 4K you can easily spot it. Furthermore, if the camera is not static you'd have to track the camera and matchmove the head, or else the CG face will be "floaty".

And VFX studios unionized? That's a good one lol. MPC, the studio responsible for the VFX for the film Cats, shut down its Vancouver office and canned everyone weeks before Christmas. If anything, unionizing would be a good thing for the industry.
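For anyone wondering what "track the camera and matchmove the head" actually involves: the face has to be re-registered to the plate on every frame, usually by estimating a per-frame transform from tracked landmarks. Here's a toy sketch of that core step in Python with NumPy (the landmark coordinates and motion are made-up illustration data, not any studio's actual pipeline):

```python
import numpy as np

def estimate_similarity(src, dst):
    """Least-squares 2D similarity transform (scale, rotation,
    translation) mapping src landmarks onto dst (Umeyama's method)."""
    src_mean, dst_mean = src.mean(0), dst.mean(0)
    s, d = src - src_mean, dst - dst_mean
    cov = d.T @ s / len(src)
    U, S, Vt = np.linalg.svd(cov)
    sign = np.sign(np.linalg.det(U @ Vt))  # guard against reflections
    D = np.diag([1.0, sign])
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / s.var(0).sum()
    t = dst_mean - scale * R @ src_mean
    return scale, R, t

# Hypothetical landmarks tracked on two consecutive frames:
frame_a = np.array([[100., 100.], [160., 100.], [130., 150.], [130., 190.]])
theta = np.deg2rad(5)  # the head rotated ~5 degrees and drifted right
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
frame_b = 1.02 * frame_a @ rot.T + np.array([12., 3.])

scale, R, t = estimate_similarity(frame_a, frame_b)
# Apply the recovered transform when pasting the CG face into frame B;
# skip this step and the face stays locked to frame A's pose -> "floaty".
print(round(scale, 3))  # ~1.02
```

Real tools solve the full 3D camera, but the idea is the same: if nobody recovers the per-frame motion, the swapped face won't stick to the head.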
 
The physical violence scenes were absolutely jarring, like watching an abusive old grandfather trying to get after it way past his prime.

I really liked it but I also think it shows Scorsese is limited in creativity.

Can't top Goodfellas.
 
That's not how VFX works. Deepfakes are far from perfect, as you can easily tell that something's off. For short low-res TikTok clips it may be fine, but at 4K you can easily spot it. Furthermore, if the camera is not static you'd have to track the camera and matchmove the head, or else the CG face will be "floaty".

And VFX studios unionized? That's a good one lol. MPC, the studio responsible for the VFX for the film Cats, shut down its Vancouver office and canned everyone weeks before Christmas. If anything, unionizing would be a good thing for the industry.

The Netflix de-aging looks far worse imo. Super unnatural, considering the budget vs. a program.
 
I don't really think the deepfake looked better. That said, I also had to watch the movie a 2nd time because of how distracted I was by the de-aging shit. And I still had to force myself to stop staring at their Smeagol mouths. The mouths just ruined it for me.

I think maybe in real life I look at mouths to kinda lip-read while listening to a voice or something. Can't really explain it. All I know is it's a fucking awesome movie that I hope gets a new coat of paint in 5 or 10 years.
 
The deepfake ones are blurry as fuck, though. Even more noticeable and jarring than Netflix's solution because of that.
 
I think the Netflix version actually looks better. It's like comparing cutscenes from current-gen and previous-gen consoles: it has higher-resolution details and less dead eyes. It feels more realistic and alive in general.
It's still impressive that random people can emulate the work of pros, but there's still some way to go (also, one can assume they selected the "good sequences" where the deep-learning method worked fine and discarded all of those that felt wonky... something the pros who handled the real movie couldn't do).
 
I don't really think the deepfake looked better. That said, I also had to watch the movie a 2nd time because of how distracted I was by the de-aging shit. And I still had to force myself to stop staring at their Smeagol mouths. The mouths just ruined it for me.

I think maybe in real life I look at mouths to kinda lip-read while listening to a voice or something. Can't really explain it. All I know is it's a fucking awesome movie that I hope gets a new coat of paint in 5 or 10 years.
If they haven't updated the original Shrek, I don't think this will ever receive a visual update.
 
That is a myth. New technologies enable new markets. The artists that don't adapt will lose their jobs, while those that do will be in high demand.
For now. The day we get AGI is the day the massive layoffs really start.
I don't really think the deepfake looked better. That said, I also had to watch the movie a 2nd time because of how distracted I was by the de-aging shit. And I still had to force myself to stop staring at their Smeagol mouths. The mouths just ruined it for me.
The deepfake ones are blurry as fuck, though. Even more noticeable and jarring than Netflix's solution because of that.
That deepfake was likely done with a small amount of compute power. With more compute time and hardware, it could likely be done at higher resolution.
 
That deepfake was likely done with a small amount of compute power. With more compute time and hardware, it could likely be done at higher resolution.
The way a deepfake works is that it uses source footage, and the source scenes used here are clearly pretty old and of lower quality than the Netflix movie. I guess if a Hollywood studio did it, they might have access to higher-res footage, or could remaster the source first.
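The softness people keep mentioning also has a simple geometric side: these tools generate a fixed-size face crop (something like 128-256 px square for the hobbyist tools of this era) that then gets stretched over the face region of the target frame. Rough back-of-the-envelope math in Python (crop size and face fraction are illustrative assumptions, not measured values):

```python
# How much a fixed-size model output gets stretched when composited
# into frames of increasing resolution. All numbers are illustrative.
MODEL_OUT = 256  # px: square face crop the swap model generates (assumed)

def upscale_factor(frame_height, face_fraction=0.35):
    """Linear upscale applied to the model's face crop, assuming the
    face spans `face_fraction` of the frame height (an assumption)."""
    face_px = frame_height * face_fraction
    return face_px / MODEL_OUT

for name, h in [("480p", 480), ("1080p", 1080), ("4K", 2160)]:
    print(f"{name}: {upscale_factor(h):.2f}x upscale")
# 480p: 0.66x upscale   (downscaled -> looks sharp)
# 1080p: 1.48x upscale
# 4K: 2.95x upscale     (each model pixel smeared over ~3 frame pixels)
```

Which is why a swap that looks fine in a YouTube compression bath would read as mushy on a 4K master: the detail was never there to begin with.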
 
The way a deepfake works is that it uses source footage, and the source scenes used here are clearly pretty old and of lower quality than the Netflix movie. I guess if a Hollywood studio did it, they might have access to higher-res footage, or could remaster the source first.
Many old films are said to have source quality above 4K due to being shot on film rather than digital.
 
You have to remember that the deepfake was applied to the already de-aged model. It would be more interesting to see what it could do from scratch.
 
Deepfake looked legit better because it more accurately portrayed their target age.

One thing the films should have absolutely done was use younger stunt doubles for the more physical scenes.
 
Maybe a fine-tuned combination of the two would look best, because whatever they're doing just isn't working. The entire movie was jarringly bad for me because of the technology. At least it's better than Tron: Legacy, but that was 10 years ago.
 