Deepfake videos could destroy trust in society—here's how to restore it

  

It has the potential to ruin relationships, reputations and our online reality. "Deepfake" artificial intelligence technology promises to create doctored videos so realistic that they're almost impossible to tell from the real thing. So far it has mostly been used to create altered pornographic clips featuring celebrity women's faces, but once the techniques are perfected, deepfake revenge porn purporting to show people cheating on their partners won't be far behind.

But beyond becoming a nasty tool for stalkers and harassers, deepfakes threaten to undermine trust in political institutions and society as a whole. The White House recently justified temporarily banning a reporter from its press conferences using genuine footage of an incident involving the journalist that had reportedly been sped up. Imagine the implications of seeing ultra-realistic but artificial footage of government leaders planning assassinations, CEOs colluding with foreign agents or a renowned philanthropist abusing children.

So-called fake news has already increased many people's scepticism towards politicians, journalists and other public figures. It is becoming so easy to create entirely fictional scenarios that we can no longer trust any video footage at face value. This threatens our political, legal and media systems, not to mention our personal relationships. We will need to create new forms of consensus on which to base our social reality. New ways of checking and distributing power—some political, some technological—could help us achieve this.

Fake scandals, fake politicians

Deepfakes are scary because they allow anyone's image to be co-opted, and call into question our ability to trust what we see. One obvious use of deepfakes would be to falsely implicate people in scandals. Even if the incriminating footage is subsequently proven to be fake, the damage to the victim's reputation may be impossible to repair. And politicians could tweak old footage of themselves to make it appear as if they had always supported something that had recently become popular, updating their positions in real time.

There could even be public figures who are entirely imaginary – original creations, but not authentic people. Meanwhile, video footage could become useless as evidence in court. Broadcast news could be reduced to people debating whether clips were authentic or not, using ever more complex AI to try to detect deepfakes.

But the arms race that already exists between fake content creators and those detecting or debunking disinformation (such as Facebook's planned fake news "war room") hides a deeper issue. The mere existence of deepfakes undermines confidence and trust, just as the possibility that an election was hacked brings the validity of the result into question.

While some people may be taken in by deepfakes, that is not the real problem. What is at stake is the underlying social structure in which we all agree that some form of truth exists, and the social realities that are based on this trust. It is not a matter of the end of truth, but the end of the belief in truth – a post-trust society. In the wake of massive disinformation, even honest public figures will be easily ignored or discredited. The traditional organisations that have supported and enabled consensus – government, the press – will no longer be fit for purpose.

Blockchain trust

New laws to regulate the use of deepfakes will be important for people who have damaging videos made of them. But policy and law alone will not save our systems of governance. We will need to develop new forms of consensus, new ways to agree on social situations based on alternative forms of trust.

One approach will be to decentralise trust, so that we no longer need a few institutions to guarantee whether information is genuine and can instead rely on multiple people or organisations with good reputations. One way to do this could be to use blockchain, the technology that powers Bitcoin and other cryptocurrencies.

Blockchain works by creating a public ledger that is stored simultaneously on many computers around the world and made effectively tamper-proof by cryptography. Its algorithms enable those computers to agree on the validity of any changes to the ledger, making it much harder to record false information. In this way, trust is distributed across all the computers, which can scrutinise each other's records, increasing accountability.
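As a purely illustrative sketch (not the design of Bitcoin or any particular blockchain, and without the distributed consensus part), the Python snippet below shows the core hash-chaining idea: each entry includes the hash of the one before it, so quietly altering an old record invalidates every later hash and the tampering is obvious to anyone who re-checks the chain.

```python
import hashlib
import json

def entry_hash(contents):
    # Hash an entry's data together with the previous entry's hash,
    # cryptographically binding each record to the one before it.
    payload = json.dumps(contents, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def append(ledger, data):
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"data": data, "prev_hash": prev_hash}
    entry["hash"] = entry_hash({"data": data, "prev_hash": prev_hash})
    ledger.append(entry)

def verify(ledger):
    # Re-compute every hash from the start; any edit to an earlier
    # entry breaks the chain from that point onwards.
    prev_hash = "0" * 64
    for entry in ledger:
        expected = entry_hash({"data": entry["data"], "prev_hash": prev_hash})
        if entry["prev_hash"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
append(ledger, "video X published by outlet A, fingerprint abc123")
append(ledger, "video Y published by outlet B, fingerprint def456")
print(verify(ledger))          # True: the chain checks out
ledger[0]["data"] = "forged"   # tamper with an old record
print(verify(ledger))          # False: the alteration is detectable
```

In a real blockchain the ledger is replicated across many independent computers, so a would-be forger would have to rewrite not just one chain but the copies held by most of the network at once.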

More democratic society

We can also look to more democratic forms of government and journalism. For example, liquid democracy allows voters to vote directly on each issue or temporarily assign their votes to delegates, in a more flexible and accountable way than handing over full control to one party for years. This would allow the public to look to experts to make decisions for them where necessary, but swiftly vote out politicians who disregarded their views or acted dishonestly, increasing trust in the political system and its legitimacy.
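As a toy illustration only (the names and structure below are invented for the example, not taken from any real voting platform), this sketch shows the basic mechanic of liquid democracy: each voter either casts a direct vote or delegates to someone they trust, and a delegation can be redirected or withdrawn at any time.

```python
def resolve_vote(voter, direct_votes, delegations):
    # Follow the chain of delegations until we reach someone who voted
    # directly; a circular or dangling delegation simply isn't counted.
    seen = set()
    while voter not in direct_votes:
        if voter in seen or voter not in delegations:
            return None
        seen.add(voter)
        voter = delegations[voter]
    return direct_votes[voter]

direct_votes = {"expert": "yes", "carol": "no"}     # voting directly
delegations = {"alice": "expert", "bob": "alice"}   # delegating their vote

for person in ["alice", "bob", "carol", "expert"]:
    print(person, "->", resolve_vote(person, direct_votes, delegations))
```

Here alice and bob both end up backing the expert's position, but either of them could reclaim their vote on the next issue, which is the flexibility and accountability the paragraph above describes.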

In the press, we could move towards more collaborative and democratised news reporting. Traditional journalists could use the positive aspects of social media to gather information from a more diverse range of sources. These contributors could then discuss and help scrutinise the story to build a consensus, improving the media's reputation.

The problem with any system that relies on the reputation of key individuals to build trust is how to prevent that reputation from being misused or fraudulently damaged. Checks such as Twitter's "blue tick" account verification for public figures can help, but better legal and technical protections are also needed: more protected rights to privacy, better responses to antisocial behaviour online, and better privacy-enhancing technologies built in by design.

The potential ramifications of deepfakes should act as a call to action in redesigning systems of trust to be more open, more decentralised and more collective. And now is the time to start thinking about a different future for society.
