How to shake the fakes out of politics
When Deep Fake technology first appeared in early 2018, it was used to put famous faces on the bodies of porn performers and produce reasonably convincing videos.
But some fear that Deep Fakes will soon serve a much darker agenda.
"There's going to be a big wave of Deep Fakes coming our way," said Fabrice Pothier, a spokesman for the Transatlantic Commission on Election Integrity that was set up to combat the growing amount of interference in regional and national elections.
Backed by former US vice-president Joe Biden and a raft of former politicians and senior figures from Nato and other bodies, the commission plans to produce tools to help elections progress without interference.
One tool will target Deep Fakes - especially those made to put words in the mouths of politicians or other public figures involved with elections.
Time is running out to develop such tools, said John Gibson from ASI Data Science, which has been advising the commission on ways to spot Deep Fake videos.
"It is probable to almost certain that within, say, a couple of years, basically anyone with a bit of tech smarts will be able to create highly persuasive video or audio of more or less anyone in the public domain saying or doing more or less what they want on a video and then disseminate it," he said.
ASI was called in because of its success in making tools to automatically spot videos made by the Islamic State group being spread on social media.
Those well-produced "official" videos were key to the radicalisation of many people who carried out "lone wolf" attacks in London and other cities, said Mr Gibson.
"There are particular classes of video that cause the real damage," he said. "They are slick and well-produced.
"The quality of the content matters because you can start to persuade people that are sceptical. These are so troubling because they are so visceral."
As the Deep Fake technology improves, it might be used to generate the convincing clips that can significantly damage debate and undermine legitimate elections.
Big web platforms such as Facebook and YouTube did a lot of their own work to find and flush out IS propaganda, said Mr Gibson, but smaller firms need help to scrutinise the huge amount of video flowing online. The same will be true of Deep Fakes.
Systems based around machine learning and AI can do the job of finding content and processing video far faster than humans can, he said.
Research suggests that the IS videos appeared online via more than 400 different platforms, said Mr Gibson. Deep Fakes are likely to be spread through at least as many routes.
"If you are spreading fake news it does not matter to you where it is, it's not like you get more status if it's on YouTube," he said. "You just want people to look at it.
"As long as it is on the open web and as long as you can cut and paste a link to it in a message the job is basically done," he told the BBC.
There have already been efforts to combat election interference, most notably during Mexico's recent presidential election.
"Mexico has a long history of social network manipulation that goes way back," said Tom Trewinnard, director of programmes at media firm Meedan, which helped to run Verificado, a project to combat fake news and disinformation in the country.
The electronic disruption intensified during the 2018 election. One of the most public examples took place during the final television debate between presidential candidates on 12 June.
During the debate, Ricardo Anaya, of the National Action Party, revealed that his party's website was making public some documents that criticised leading candidate Andres Manuel Lopez Obrador.
While the debate was under way, the site was hit by a sustained cyber-attack and was knocked offline for hours.
Other interference included hashtag poisoning on Twitter.
This, said Mr Trewinnard, involves a campaign flooding Twitter with spam posts using a trending hashtag that supports a rival.
"That triggers Twitter's spam filters which kills the hashtag from the trending feed," said Mr Trewinnard. - BBC