Even if you legally mandate that the software makes the videos identifiable as fake, you're not going to be able to enforce it on a global scale forever. Eventually someone somewhere will develop their own tech outside of the law if they have a big enough incentive to do so.
Then we'll have to build AIs to expose fake videos.
So it won't be long until we can have video proof of anybody saying anything we like.
this will end well
Very dangerous.
Reminds me of The Running Man when ICS faked Arnold's death because he kept killing stalkers.
The programs used in the future to produce these animations will need a rock-solid system of verification, much like image-editing programs refuse to process scans of banknotes, so that these technologies can't be used to spread misinformation. Either watermarks, or protection against using the faces of politicians and celebrities, or some kind of encrypted metadata in the files they produce, or simply a telltale sign that the image is artificial.
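One way to read the "encrypted metadata" idea is a cryptographic signature over the file bytes plus a synthetic flag, so a verifier holding the key can spot tampering or a stripped label. A minimal sketch, assuming a hypothetical vendor-held signing key (every name here is made up, and a real scheme would also have to survive re-encoding, cropping, and key leaks):

```python
import hmac, hashlib, json

SECRET_KEY = b"tool-vendor-secret"  # assumption: key held by the tool's vendor

def make_provenance_tag(video_bytes: bytes, synthetic: bool) -> dict:
    """Build a tag declaring whether the file is synthetic, signed with HMAC."""
    payload = {"synthetic": synthetic,
               "sha256": hashlib.sha256(video_bytes).hexdigest()}
    msg = json.dumps(payload, sort_keys=True).encode()
    payload["sig"] = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return payload

def verify_provenance_tag(video_bytes: bytes, tag: dict) -> bool:
    """Check both the signature and that the hash matches these exact bytes."""
    claimed = {k: v for k, v in tag.items() if k != "sig"}
    msg = json.dumps(claimed, sort_keys=True).encode()
    good_sig = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(good_sig, tag.get("sig", ""))
            and claimed["sha256"] == hashlib.sha256(video_bytes).hexdigest())

video = b"\x00fake-frames\x01"  # stand-in for real file bytes
tag = make_provenance_tag(video, synthetic=True)
assert verify_provenance_tag(video, tag)                      # intact tag passes
assert not verify_provenance_tag(video + b"x", tag)           # edited file fails
```

The obvious weakness, as the thread goes on to note, is that anyone motivated enough can simply use a tool that never attaches the tag in the first place.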
People will just claim the verification AI's results are fake if they don't want to believe them.
Actually, something similar is already done in GAN image-generation models, where a discriminator tries to guess which images are real and which were produced by the generator network, in a sort of endless tug of war.
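That tug of war can be sketched in a few lines of numpy. This is a toy 1-D illustration only (the linear generator, logistic discriminator, and all hyperparameters are my own assumptions, not any real model): the generator learns to shift its samples toward the real data's mean because that is the only way to stop the discriminator telling them apart.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))

# Real data ~ N(4, 1). Generator g(z) = a*z + b starts far away at mean 0.
# Discriminator D(x) = sigmoid(w1*x + w0) scores "how real" a sample looks.
a, b = 1.0, 0.0
w1, w0 = 0.0, 0.0
lr, batch = 0.05, 64

for step in range(2000):
    real = rng.normal(4.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: push D(real) toward 1, D(fake) toward 0.
    dr, df = sigmoid(w1 * real + w0), sigmoid(w1 * fake + w0)
    w1 -= lr * (np.mean((dr - 1) * real) + np.mean(df * fake))
    w0 -= lr * (np.mean(dr - 1) + np.mean(df))

    # Generator step (non-saturating loss): make D call the fakes real.
    df = sigmoid(w1 * fake + w0)
    a -= lr * np.mean((df - 1) * w1 * z)
    b -= lr * np.mean((df - 1) * w1)

# After training, the generator's mean b has drifted toward the real mean of 4.
print(round(b, 2))
```

The two updates are exactly the "guessing" and "fooling" roles described above; at equilibrium the discriminator can no longer do better than chance.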
No.
CG character animation in movies is gonna be a lot easier now, right?
This technology has truly frightening political implications as it becomes more affordable. Sure, Hollywood can sort of do that now, but as it becomes cheaper and easier to do, I really don't like where that could go in the wrong hands. Especially with today's political climate.
Historians in the future are going to have a hell of a time.
My grandkids are going to have it rough as hell.
Disinformation up the ass incoming.
They'll be able to talk to "you" long after you died.
Can't wait till our first completely CG president.
Fake news is gonna reach new highs.
...am I the only person who thinks it looks totally fucking fake?
In a close up it's pretty obvious that it's fake.
However, treated as security-camera footage, or degraded in some other way, it would probably be hard to prove one way or the other.