Why aren't studios using this tech? It's almost like the VFX studios are unionized and don't want an "easy button."
Producers have studios they like to work with to handle things. Most likely ML-based technologies weren't even a consideration; the traditional training of FX artists doesn't include that knowledge. Machine learning applied to FX is still a breakthrough, and most FX studios haven't adopted it.

It learns off a library of recordings of that person, right? Maybe getting the rights to all those films to build that up is prohibitively expensive?
Yeah, I mean, imagine the layoffs. Also, as good as deepfakes are, you'll still need someone to finesse it.
That's not how VFX works. Deepfakes are far from perfect, as you can easily tell that something's off. For short low-res TikTok clips it may be fine, but at 4K res you can easily spot it. Furthermore, if the camera is not static you'd have to track the camera and matchmove the head, or else the CG face will be "floaty".
And VFX studios unionized? That's a good one, lol. MPC, the studio responsible for the VFX for the film Cats, shut down its Vancouver office and canned everyone weeks before Christmas. If anything, unionizing would be a good thing for the industry.
If they haven't updated the original Shrek, I don't think this will ever receive a visual update.
I think maybe in real life I look at mouths to kinda lip-read while listening to a voice or something. Can't really explain it. All I know is it's a fucking awesome movie that I hope gets a new coat of paint in 5 or 10 years.
For now. The day we get AGI is the day the massive layoffs really start.

That is a myth. New technologies enable new markets. The artists that don't adapt will lose their jobs, while those that do will be in high demand.
I don't really think the deepfake looked better. That said, I also had to watch the movie a 2nd time because of how distracted I was by the de-aging shit. And I still had to force myself to stop staring at their Smeagol mouths. The mouths just ruined it for me.
The deepfake ones are blurry as fuck, though. Even more noticeable and jarring than Netflix's solution because of that.

That deepfake was likely done with a small amount of compute power. With more compute time and hardware, it could likely be done at higher resolution.
The way deepfakes work is that they use a source, and the source scenes used here are clearly pretty old and of lower quality than the Netflix movie. I guess if a Hollywood studio did it, they might have access to higher-res footage, or could remaster the source first.
Many old films are said to have source quality above 4K, due to being shot on film rather than digitally.
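For anyone curious what "it learns off a library of recordings" means in practice: the classic deepfake setup trains one shared encoder plus a separate decoder per identity, and the swap happens by decoding person A's frame with person B's decoder. Here's a toy sketch of that structure (nothing here is a real model; the random matrices stand in for trained weights, and the 64x64 crop size is made up):

```python
# Toy sketch of the classic deepfake architecture: a shared encoder
# compresses any face to a latent code, and each identity gets its own
# decoder. Random weights stand in for what training would learn.
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    # Placeholder for a trained layer.
    return rng.standard_normal((in_dim, out_dim)) * 0.1

# Shared encoder: flattened 64x64 face crop -> 128-dim latent code.
W_enc = linear(64 * 64, 128)
# One decoder per identity: latent code -> reconstructed face.
W_dec_a = linear(128, 64 * 64)
W_dec_b = linear(128, 64 * 64)

def encode(face):
    return np.tanh(face @ W_enc)

def decode(latent, W_dec):
    return latent @ W_dec

# The "swap": encode a frame of person A, decode with B's decoder.
frame_of_a = rng.standard_normal(64 * 64)
swapped = decode(encode(frame_of_a), W_dec_b)
print(swapped.shape)  # (4096,)
```

This is also why source quality matters so much: decoder B can only reproduce detail it saw in B's training footage, so old, low-res clips put a hard ceiling on the output.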
You have to remember that the deepfake was applied to the already de-aged model. It would be more interesting to see what it could do from scratch.