
How soon until we can’t trust video?


How close is the world to never being able to trust a video recording again? It may be closer than you think. The technology to alter the face and voice of anyone in a video is already here, and anyone can do it with some software and a cheap webcam.

The technology is known as Face2Face, and it is quickly being developed at a pair of German research centers as well as at Stanford University. The aim of this technology, it seems, is “to animate the facial expressions of the target video by a source actor and re-render the manipulated output video in a photo realistic fashion.”

So, what does all of that academia/politician speak mean, anyway? It means that anyone can replace the voice and facial movements of a person in a video and have them act and talk however you, or some other third party, decide. The inventors explain that facial-tracking software captures the face to be transferred, and the new voice and facial imagery are driven entirely by the data fed into the software.
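To get a sense of how accessible the underlying face-tracking step has become, here is a minimal sketch in Python. This is not the Face2Face code itself, which the researchers have not released; it simply tracks facial landmarks from a cheap webcam using the off-the-shelf OpenCV and MediaPipe libraries, the kind of commodity building block that systems like this sit on top of. The library choices and parameters here are illustrative assumptions, not part of the Face2Face system.

```python
# Illustrative sketch only: tracks facial landmarks from a webcam.
# Assumes OpenCV and MediaPipe are installed (pip install opencv-python mediapipe).
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(
    max_num_faces=1,             # track a single source actor
    refine_landmarks=True,       # finer detail around the eyes and lips
    min_detection_confidence=0.5,
)

cap = cv2.VideoCapture(0)        # any cheap webcam will do
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures frames in BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        h, w = frame.shape[:2]
        # Draw each tracked landmark as a small dot on the live frame.
        for lm in results.multi_face_landmarks[0].landmark:
            cv2.circle(frame, (int(lm.x * w), int(lm.y * h)), 1, (0, 255, 0), -1)
    cv2.imshow("tracked landmarks", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

A real reenactment system goes much further, fitting a 3D face model, transferring expressions from a source actor, and re-rendering the result, but the tracking data that drives it all comes from ordinary consumer hardware and steps like this one.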

The inventors further state that it is the RGB-only approach that makes the difference: the system works from ordinary color video, with no special sensors. It can even re-synthesize the inside of the target’s mouth so that it matches the manipulated movements on the outside. To pretty much any viewer, it will look like the person on screen, only saying things they never actually said in the original, real video.

Project videos from the creators can be seen on YouTube, showing them dubbing the voices and facial expressions of some well-known people. The results look seamless and relatively simple to produce. The ability to do this existed before, but it required highly specialized skills and equipment. Once a product like this gets released, a first grader could do it.

The creators say it will have all sorts of useful applications in gaming and film production, as well as certain business uses such as teleconferencing. The technology is still not perfect, however. Its heavy reliance on face mapping means it gets thrown off if the head or face moves too much or is obscured, and hair and hands tend to confuse it as well. But, they say, they are working on that…

While all of this may seem like science for its own sake, some have speculated, especially the conspiracy theorists, that there could be far more nefarious and ominous uses for such technology. Will there come a time when video manipulation is so widespread that it deals video the same blow Photoshop dealt to photographs, dooming the credibility of any recording, anywhere, ever again?

PHOTO CREDIT: Pixabay