Tech battles creepy revenge porn


Face swapping can be a fun diversion, but it has a sinister side. New software is trying to combat its misuse.

Face swapping is already widely available via apps like Snapchat and Face Swap Live. While it's meant to be harmless fun, it can also be used maliciously. One example is revenge porn, in which the face of an ex-spouse is swapped onto a performer in an explicit video.

“Pornographic videos called ‘deepfakes’ have emerged on websites such as Reddit and 4Chan showing famous individuals’ faces superimposed onto the bodies of actors,” according to a report at MIT Technology Review.


The report continues: “At the very least, it has the potential to undermine the reputation of people who are victims of this kind of forgery.”

To combat this, Andreas Rossler at the Technical University of Munich in Germany, working with other academics, has developed a system for detecting forgeries.

In an abstract describing the technology, the researchers say that some uses “raise a legitimate alarm,” making it necessary to develop “reliable detectors of fake videos.”

Distinguishing fakes from genuine footage is extremely difficult for humans, and it’s challenging for computers too, according to the researchers, “especially when the videos are compressed or have low resolution, as it often happens on social networks.”

In a YouTube video explaining the technology, the researchers show video clips in which it is virtually impossible to tell that the facial expressions have been manipulated. For example, one clip shows the face of Russian President Vladimir Putin, with the original video and the manipulated version side by side. The researchers detect the “manipulated pixels” of an image to determine whether it’s a fake.
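The article doesn't detail how "manipulated pixels" are detected (the researchers use a deep neural network), but the underlying intuition can be sketched with a toy example. The statistic below (block-wise noise variance) and its threshold are illustrative inventions, not the researchers' method: face-swap blending tends to smooth away the natural sensor noise in a region, so anomalously low local variance is one crude tell.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_variance_map(img, block=8):
    """Per-block pixel variance over non-overlapping block x block tiles."""
    h, w = img.shape
    hb, wb = h // block, w // block
    tiles = img[:hb * block, :wb * block].reshape(hb, block, wb, block)
    return tiles.var(axis=(1, 3))

def flag_suspicious_blocks(img, block=8, threshold=1.0):
    """Flag blocks whose noise variance is anomalously low --
    a crude proxy for smoothed/blended (manipulated) regions."""
    return local_variance_map(img, block) < threshold

# Synthetic "frame": natural sensor noise everywhere...
frame = rng.normal(128.0, 5.0, size=(64, 64))
# ...except a patch where a face swap has smoothed the pixels flat.
frame[16:32, 16:32] = 128.0

mask = flag_suspicious_blocks(frame, block=8, threshold=1.0)
print(mask.sum())  # 4 -- the four 8x8 blocks covering the smoothed patch
```

A real detector learns far subtler cues than flat regions, but the principle is the same: find pixel statistics that manipulation disturbs.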


To date, research on face manipulation has been hampered by a “lack of…datasets,” according to Rossler and his team. So they created a “face manipulation dataset” of about half a million edited images drawn from over 1,000 videos.

Their “deep learning algorithm” for detecting fakes is trained on this set of images, according to the MIT Technology Review report.
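To illustrate what "training a detector on labeled images" means in the simplest possible terms, here is a toy sketch. It is not the researchers' deep network: it reduces each image to a single made-up feature (mean local noise variance, with real frames noisier than fakes) and fits a logistic-regression classifier by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset: one feature per image. Real frames keep natural
# sensor noise (high variance); fakes are smoother (low variance).
real = rng.normal(25.0, 3.0, size=200)
fake = rng.normal(5.0, 3.0, size=200)
X = np.concatenate([real, fake])
y = np.concatenate([np.ones(200), np.zeros(200)])  # 1 = real, 0 = fake

# Fit logistic regression p(real | x) = sigmoid(w*x + b) by gradient descent.
w, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))
    w -= lr * np.mean((p - y) * X)
    b -= lr * np.mean(p - y)

preds = (1.0 / (1.0 + np.exp(-(w * X + b)))) > 0.5
accuracy = np.mean(preds == y)
print(round(accuracy, 2))
```

A deep network does the same thing at vastly larger scale: instead of one hand-picked feature, it learns thousands of features directly from the half-million labeled images.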

But bad actors can take advantage of this too. “The same deep-learning technique that can spot face-swap videos can also be used to improve the quality of face swaps in the first place—and that could make them harder to detect,” according to the MIT Technology Review.

But Rossler and his team have found that even when the visual quality of the forgery is refined, it does not have much effect on the technology they use to detect it.
