Deep Video Portraits – a new level of manipulation based on artificial intelligence

Scientists have taught artificial intelligence to create fake but extremely realistic footage of famous people. The system superimposes the facial expressions and head movements of one person onto pre-existing source recordings found online, e.g. of celebrities or politicians. The effect is electrifying and has stirred up a lot of controversy.

The scientists who developed the technology were themselves horrified by their feat. Previously, only facial expressions could be manipulated with the appropriate software. The results were interesting, though not entirely convincing: it was easy to distinguish the fake footage from the original. The new technique transfers three-dimensional head movement, facial expressions, eye gaze and blinking from a filmed face onto another person whose footage can be found online.

The program, called Deep Video Portraits, was built on earlier learning algorithms. The result is a surprisingly realistic image, full of detail and subtlety: the program picks up even a slight nod or a shrug of the shoulders.
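To make the idea concrete: systems of this kind first describe each video frame by a small set of parameters (head pose, expression, eye gaze, blinking), then combine the motion parameters of the source actor with the identity and lighting of the target before a trained network renders the final image. The Python sketch below illustrates only that parameter-transfer step under stated assumptions; the class and function names, as well as the parameter dimensions, are hypothetical and are not the authors' actual code or model.

```python
# A minimal, illustrative sketch of the parameter-transfer idea behind
# face-reenactment systems like Deep Video Portraits. This is NOT the
# authors' implementation: names, dimensions, and structure are assumed.
# The real system fits a full 3D face model per frame and renders the
# result with a trained neural network.
from dataclasses import dataclass, replace
import numpy as np

@dataclass
class FrameParams:
    identity: np.ndarray      # facial shape of the person (kept from the target)
    illumination: np.ndarray  # scene lighting (kept from the target)
    pose: np.ndarray          # 3D head rotation and translation (taken from the source)
    expression: np.ndarray    # facial expression coefficients (taken from the source)
    gaze: np.ndarray          # eye direction (taken from the source)
    blink: float              # eyelid closure (taken from the source)

def transfer_params(source: FrameParams, target: FrameParams) -> FrameParams:
    """Combine the source actor's motion with the target's identity and lighting."""
    return replace(
        target,
        pose=source.pose,
        expression=source.expression,
        gaze=source.gaze,
        blink=source.blink,
    )

# Example with one pair of synthetic frames (random stand-in values).
rng = np.random.default_rng(0)
src = FrameParams(rng.normal(size=80), rng.normal(size=27),
                  rng.normal(size=6), rng.normal(size=64),
                  rng.normal(size=2), blink=0.9)
tgt = FrameParams(rng.normal(size=80), rng.normal(size=27),
                  rng.normal(size=6), rng.normal(size=64),
                  rng.normal(size=2), blink=0.0)

fake = transfer_params(src, tgt)
assert np.allclose(fake.identity, tgt.identity)  # target still looks like themselves
assert np.allclose(fake.pose, src.pose)          # but moves like the source actor
```

Because the transfer happens in this compact parameter space rather than on raw pixels, even subtle motions like a nod or a blink carry over cleanly, which is what makes the final rendered footage so convincing.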

What’s more, the new technique leaves behind far fewer visual artifacts, which makes the forgery harder to detect. The videos are so smooth that people surveyed were unable to spot any manipulation and believed the footage was real.

Would you be able to catch a fake? See the video below:

The new research, by scientists from Stanford University, is to be presented this summer at the SIGGRAPH conference on VR filmmaking. The researchers believe the technology can be put to practical use: filmmakers, for example, could use it to adjust an actor’s facial expressions after filming, and the program could also be used in dubbing.

Still, the technology is stirring up quite a bit of controversy. Politicians and researchers are concerned that the program will allow the creation of new, extremely dangerous and realistic fake news.

"Unfortunately, despite its positive sides, such technology could be abused. The now-modified recordings leave many tracesoin, thanks to whichorym it is possible to figure out counterfeiting. It’s hard to predict when fake videos will be unrotional from real recordings" – wrote the researchers.
