Can anything protect us from deepfakes?

In late 2017, Motherboard reported on an AI technology that could swap faces in videos. At the time, the tech—later called deepfakes—produced crude, grainy results and was mostly used to create fake porn videos featuring celebrities and politicians.

Two years later, the technology has advanced tremendously and is harder to detect with the naked eye. Along with fake news, forged videos have become a national security concern, especially as the 2020 US presidential election draws near.

Since deepfakes emerged, several organizations and companies have developed technologies to detect AI-tampered videos. But there's a fear that one day, deepfakes will become impossible to detect.

Researchers at the University of Surrey have developed an approach that might solve the problem: instead of detecting what's false, it proves what's true. Scheduled to be presented at the upcoming Conference on Computer Vision and Pattern Recognition (CVPR), the technology, called Archangel, uses AI and blockchain to create and register a tamper-proof digital fingerprint for authentic videos. The fingerprint can be used as a point of reference for verifying the validity of media being distributed online or broadcast on television.


Using AI to Sign Videos

The classic way to prove the authenticity of a binary document is to use a digital signature. Publishers run their document through a cryptographic hashing algorithm such as SHA-256 or MD5, which produces a "hash," a short string of bytes that represents the content of that file and becomes its digital signature. Running the same file through the hashing algorithm at any time will produce the same hash as long as its contents haven't changed.

Hashes are extremely sensitive to changes in the binary structure of the source file. Modify a single byte of the hashed file and run it through the algorithm again, and it produces a totally different result.
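
This avalanche effect is easy to see in a few lines of Python using the standard hashlib library; the two strings below stand in for file contents:

```python
import hashlib

original = b"The quick brown fox jumps over the lazy dog"
modified = b"The quick brown fox jumps over the lazy cog"  # one byte changed

# SHA-256 always produces a 32-byte digest; identical inputs give
# identical digests, but a single-byte change scrambles the result.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(modified).hexdigest())
```

The two digests share no resemblance, even though the inputs differ by just one character.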

But while hashes work well for text files and applications, they present challenges for videos, which can be stored in different formats, according to John Collomosse, professor of computer vision at the University of Surrey and project lead for Archangel.

"We wanted the signature to be the same regardless of the codec the video is being compressed with," Collomosse says. "If I take my video and convert it from, say, MPEG-2 to MPEG-4, then that file will be of a totally different length, and the bits will have completely changed, which will produce a different hash. What we needed was a content-aware hashing algorithm."

To solve this problem, Collomosse and his colleagues developed a deep neural network that is sensitive to the content contained in the video. Deep neural networks are a type of AI model that learns its behavior by analyzing vast numbers of examples. Interestingly, neural networks are also the technology at the heart of deepfakes.

When creating deepfakes, the developer feeds the network pictures of a subject's face. The neural network learns the features of the face and, with enough training, becomes capable of finding faces in other videos and swapping them with the subject's.

Archangel's neural network is trained on the video it's fingerprinting. "The network is looking at the content of the video rather than its underlying bits and bytes," Collomosse says.

After training, when you run a new video through the network, it validates the video if it contains the same content as the source, regardless of format, and rejects it if it's a different video or has been tampered with or edited.
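
Archangel's actual architecture isn't detailed here, but the verification step can be sketched in miniature. In the Python below, embed_frames is a deliberately crude stand-in (a per-frame color histogram) for the trained network, and the similarity threshold is an illustrative assumption, not a value from the project:

```python
import numpy as np

def embed_frames(frames):
    # Toy stand-in for the trained network: a coarse per-frame color
    # histogram, which survives re-encoding but shifts under edits.
    return np.stack([np.histogram(f, bins=32, range=(0, 255))[0] / f.size
                     for f in frames])

def fingerprint(frames):
    # Pool the per-frame features into a single signature vector.
    return embed_frames(frames).mean(axis=0)

def verify(candidate_frames, registered_fp, threshold=0.95):
    # High cosine similarity: same content, any codec.
    # Low similarity: a different or tampered video.
    fp = fingerprint(candidate_frames)
    sim = fp @ registered_fp / (np.linalg.norm(fp) * np.linalg.norm(registered_fp))
    return sim >= threshold
```

The key design point is that the comparison happens in a content feature space rather than over raw bytes, which is what makes the fingerprint codec-agnostic.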

According to Collomosse, the technology can detect both spatial and temporal tampering. Spatial tampering involves changes made to individual frames, such as the face-swapping edits done in deepfakes.

But deepfakes are not the only way videos can be tampered with. Less discussed but equally dangerous are intentional changes made to the sequence of frames and to the speed and duration of the video. A recent, widely circulated tampered video of House Speaker Nancy Pelosi did not use deepfakes but was created through the careful use of simple editing techniques that made her appear confused.

"One of the forms of tampering we can detect is the removal of short segments of the video. These are temporal tampers. And we can detect up to three seconds of tampering. So, if a video is several hours long and you just remove three seconds of that video, we can detect that," Collomosse says, adding that Archangel will also detect changes made to the speed of the original video, as was done in the Pelosi video.

Registering the Fingerprint on the Blockchain

The second component of the Archangel project is a blockchain, a tamper-proof database where new information can be stored but not changed—ideal for video archives, which don't make changes to videos once they've been registered.

Blockchain technology underlies digital currencies such as Bitcoin and Ether. It's a digital ledger maintained by several independent parties. A majority of the parties must agree on every record added to the blockchain, which makes it impossible for any single party to meddle with the ledger unilaterally.

It's technically possible to attack and change the content of a blockchain if more than 50 percent of its participants collude. But in practice, it's extremely difficult, especially when the blockchain is maintained by many independent parties with varying goals and interests.
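
That tamper evidence comes from chaining: each block's hash covers the previous block's hash, so rewriting history breaks every later link. A toy Python ledger (not Archangel's implementation) makes the mechanics concrete:

```python
import hashlib, json

GENESIS = "0" * 64

def block_hash(record, prev_hash):
    # The hash covers both this record and the previous block's hash.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain, record):
    prev = chain[-1]["hash"] if chain else GENESIS
    chain.append({"record": record, "hash": block_hash(record, prev)})

def verify_chain(chain):
    prev = GENESIS
    for block in chain:
        if block_hash(block["record"], prev) != block["hash"]:
            return False        # an earlier record was altered
        prev = block["hash"]
    return True
```

Editing any stored record changes its hash, which invalidates every block after it; the only practical attack is the majority collusion described above.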

Archangel's blockchain is a bit different from public blockchains. First, it doesn't produce cryptocurrency, and it stores only an identifier, the content-aware fingerprint, and the binary hash of the verifier neural network for each video in an archive. (Blockchains are not suitable for storing large amounts of data, which is why the video itself and the neural network are stored off-chain.)
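
Put together, each on-chain entry stays small. Here's a sketch of the record shape just described; the field names are illustrative, not the project's actual schema:

```python
from dataclasses import dataclass

@dataclass
class ArchangelRecord:
    video_id: str               # the archive's identifier for the video
    content_fingerprint: bytes  # output of the content-aware network
    verifier_model_hash: str    # hash of the trained network's binary
    # The video itself and the trained network live off-chain;
    # only these compact proofs go on the ledger.
```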

Also, it's a permissioned or "private" blockchain. This means that unlike the Bitcoin blockchain, where anyone can record new transactions, only permissioned parties can store new records on the Archangel blockchain.

Archangel is currently being trialed by a network of national government archives from the UK, Estonia, Norway, Australia, and the US. To store new information, every participating country must underwrite the addition. But while only those countries' National Archives have the right to add records, everyone else has read access to the blockchain and can use it to validate other videos against the archive.
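
In other words, writes require unanimous underwriting by the member archives, while reads are open to all. A minimal sketch of that rule, assuming the five members named above (a real deployment would verify cryptographic signatures, not names):

```python
MEMBERS = {"UK", "Estonia", "Norway", "Australia", "US"}

def accept_record(record, approvals):
    # Commit only when every member archive has underwritten the
    # addition; anyone may read the chain to validate videos.
    return MEMBERS.issubset(approvals)
```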

"This is an application of blockchain for the public good," Collomosse says. "In my view, the only reasonable use of the blockchain is when you have independent organizations that don't necessarily trust one another but they do have this vested interest in this collective goal of mutual trust. And what we're looking to do is secure the National Archives of government all around the world, so that we can underwrite their integrity using this technology."

Because creating forged videos is becoming easier, faster, and more accessible, everyone is going to need all the help they can get to ensure the integrity of their video archives, especially governments.

"I think deepfakes are almost like an arms race," Collomosse says. "Because people are producing increasingly convincing deepfakes, and someday it might become impossible to detect them. That's why the best you can do is try to prove the provenance of a video."

This article originally appeared on PCMag.com.