WHY THIS MATTERS IN BRIEF
When you can no longer trust the information you see, and when technology lets you create fake people and fake content, the fabric of society starts to unravel.
As emerging technologies such as Artificial Intelligence (AI) make it easier to create deepfakes and other kinds of synthetic content, these tools are increasingly being used for both fun and nefarious purposes, with some people trying to use them to sow misinformation, discredit people and countries, and polarise societies.
In the latest examples of the weaponization of deepfakes, videos recently published online appeared to show the presidents of Russia and Ukraine issuing major statements about the Ukraine war. But the videos were quickly identified by experts as fakes and removed from social media.
The Future of Synthetic Content and DeepFakes, by Keynote Speaker Matthew Griffin
A deepfake is a video designed to fool people into thinking it is real. People who create deepfakes use different technology tools and methods to make their subjects appear to say things they never said.
Earlier this month, a video was shared on Twitter that appeared to show Russian President Vladimir Putin declaring that a peace agreement had been reached with Ukraine. But the video was from a speech Putin gave on February 21. It had been manipulated by replacing Putin’s voice with new synthetic audio, another trend that is on the rise.
In the speech, Putin appears to say, “We’ve managed to reach peace with Ukraine.” He also goes on to announce that Crimea – which Russia seized from Ukraine in 2014 – would once again become an independent republic inside Ukraine.
See the bad fake for yourself
Experts from Reuters news agency and other organizations said their examinations of the video showed that it matched one that was published by Putin’s official website. In that video, Putin was speaking about the situation in Ukraine before Russia invaded.
Meanwhile, in another deepfake, Ukrainian President Volodymyr Zelenskyy appeared to urge people in his country to give up their weapons and surrender to Russia’s invading forces. In that case, both the video and audio had been manipulated in an effort to make it seem like Zelenskyy was offering a surrender.
Both of the videos were widely shared online. Many social media users immediately criticized the quality of the Zelenskyy video. They pointed out that Zelenskyy’s skin color looked different and that he spoke with an unusual accent. Some people also noted that the video quality was not sharp around his head and face.
Zelenskyy’s video was also identified as fake by multiple media organizations and fact-checking websites. Nina Schick, author of the book Deepfakes, told Reuters the video looked like a “terrible face swap.” A face swap is a digital technique that puts one person’s face onto another person’s body.
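To make the idea concrete, here is a deliberately crude sketch of what a naive face swap amounts to at the pixel level: copying one face region into another image and blending it in. This is a toy illustration using a hypothetical `naive_face_swap` helper on synthetic arrays, not how modern deepfake tools actually work; real face swaps align facial landmarks and use trained generative models, which is precisely why they can look far more convincing than a rectangular paste like this one.

```python
import numpy as np

def naive_face_swap(target, source_face, box, alpha=0.8):
    """Paste source_face into target at box=(y, x, h, w) with a simple
    alpha blend. Toy illustration only: real face-swap pipelines align
    landmarks and use generative models rather than a rectangular paste."""
    y, x, h, w = box
    out = target.astype(float).copy()
    patch = source_face.astype(float)
    # Blend the pasted region with the underlying pixels so the seam
    # is less abrupt (alpha=1.0 would be a hard copy-paste).
    out[y:y+h, x:x+w] = alpha * patch + (1 - alpha) * out[y:y+h, x:x+w]
    return out.astype(target.dtype)

# Tiny synthetic "images": an 8x8 grayscale frame and a 4x4 "face" patch.
target = np.zeros((8, 8), dtype=np.uint8)
face = np.full((4, 4), 200, dtype=np.uint8)
swapped = naive_face_swap(target, face, (2, 2, 4, 4))
```

The hard seams, mismatched skin tone, and blur around the pasted region in a crude swap like this are exactly the tell-tale artifacts viewers spotted in the Zelenskyy video.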
Television station Ukraine24 said in a Facebook message that “enemy hackers” had published the video on its website. The station removed the video, as did Facebook, YouTube and other social media organizations.
Later, Ukraine’s Ministry of Defense released a video from Zelenskyy in which he denounced the deepfake. He called the attempt a “childish provocation.”
The head of security policy at Facebook’s parent Meta, Nathaniel Gleicher, reacted to the video on Twitter. “Earlier today, our teams identified and removed a deepfake video claiming to show President Zelensky issuing a statement he never did,” he wrote.
Gleicher added that any similar videos linked to the Ukraine war or other subjects would be removed from Meta’s social media services for violating company policies covering misinformation.
Twitter says it is also seeking to limit the spread of false information and content about the Ukraine war. The company’s Vice President of Global Public Policy, Sinéad McSweeney, said those efforts had already resulted in the removal of more than 50,000 pieces of content found to be false or misleading. Among these were old videos of past conflicts presented as if they were taking place in Ukraine.
Sam Gregory is an expert on technology and disinformation for the non-profit group WITNESS. He told Reuters there have been numerous uses of deepfakes so far related to the war in Ukraine. Gregory said some aim to spread misinformation, while others are attempts to present parody.
He warned, however, that there can be a fine line between disinformation and parody, depending on how effective a deepfake is at fooling the public.
On Twitter, he called the Zelenskyy deepfake a “best case” scenario, because the video quality was easily recognised as poor and the Ukrainian president was able to quickly publish a real video explaining that it was fake. But in the next few years, as the technology improves, it will become much harder to tell the difference, which makes the development of new detection tools incredibly important.