The Internet, and social media in particular, has been a boon for child predators and abusers. A plan by Facebook to implement end-to-end encryption across its services, including Instagram, WhatsApp and Facebook Messenger, will make the problem worse.
In 2018, the CyberTipline at the National Center for Missing and Exploited Children (NCMEC) received over 18 million reports of apparent child sexual abuse images, online enticement, child sex trafficking, and child sexual molestation. Perhaps more startling than this volume of more than 2,000 reports per hour is that last year's reports account for 40 percent of all reports received by the CyberTipline since its inception in 1998.
Most major technology companies have deployed technology that has proven effective at disrupting the global distribution of known child sexual abuse material (CSAM). This technology, photoDNA, developed by Microsoft and donated to NCMEC, works by extracting a distinct digital signature from known CSAM and comparing those signatures against the signatures of content at the point of upload. Flagged content can then be instantly removed and reported.
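photoDNA itself is proprietary, but the matching flow it enables can be sketched with an open-source perceptual hash standing in for it. In the rough Python sketch below, the imagehash library, the example signature values, the threshold, and the file names are all illustrative assumptions, not details of photoDNA or of any platform's actual deployment.

```python
# A sketch of signature-based matching at the point of upload, using the
# open-source "imagehash" perceptual hash as a stand-in for the proprietary
# photoDNA algorithm. Signature values and threshold are hypothetical.
import imagehash
from PIL import Image

# Signatures of known CSAM, as maintained by a clearinghouse such as NCMEC
# (illustrative hex values only).
KNOWN_SIGNATURES = [
    imagehash.hex_to_hash("d1d1b39ce6c8c8f0"),
    imagehash.hex_to_hash("8f0f1f3f7f3e1c08"),
]
MATCH_THRESHOLD = 6   # maximum Hamming distance still treated as a match

def check_upload(path: str) -> bool:
    """Return True if the uploaded image matches a known signature."""
    signature = imagehash.phash(Image.open(path))        # extract signature
    return any(signature - known <= MATCH_THRESHOLD      # Hamming distance
               for known in KNOWN_SIGNATURES)

if check_upload("upload.jpg"):
    pass  # flag the content for removal and reporting
```

Because the comparison tolerates small differences between signatures, the match survives resizing, recompression and minor edits, which is what makes robust hashing more useful here than an exact cryptographic hash.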
This type of robust hashing technology is similar to that used to detect other harmful digital content like viruses and malware. Since its development in 2009 and its eventual worldwide deployment, photoDNA has remained one of the most effective strategies for combating child sexual abuse online.
The efficacy of this technology, however, is under threat.
Earlier this year, Facebook's Mark Zuckerberg announced plans to implement end-to-end encryption across the company's platforms, preventing anyone, including Facebook itself, from seeing the contents of any communication.
In announcing the decision, Zuckerberg conceded that it came at a cost.
"At the same time, there are real safety concerns to address before we can implement end-to-end encryption across all of our messaging services," he wrote. "Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion."
There is, of course, no such thing as complete privacy or complete security. On the privacy side, even end-to-end encryption does not provide users with as much privacy as they may think. Even without the ability to read the contents of your messages, Facebook will still know with whom you are communicating, when you are communicating, from where you are communicating, as well as a trove of information about your other online activities.
This is a far cry from real privacy. And, on the security side, with or without end-to-end encryption, it will never be possible to fully secure our online or offline world. Our goal, therefore, should be to find a reasonable balance between privacy and security.
The adoption of end-to-end encryption would significantly hamper the efficacy of photoDNA. This is particularly troubling given that the majority of Facebook's millions of yearly reports to NCMEC's CyberTipline originate on its messaging services. Blindly implementing end-to-end encryption will significantly increase the risk and harm to children around the world.
Recent advances in encryption and robust hashing technology, however, mean that technologies like photoDNA can be adapted to operate within an end-to-end encryption system.
Specifically, when using certain types of encryption algorithms (so-called partially- or fully-homomorphic encryption), it is possible to perform the same type of robust "image hashing" on encrypted data. This means that encrypted images can be analyzed to determine if they are known illicit or harmful material without the need, or even ability, to decrypt the image.
For all other images, this analysis provides no information about their contents, thus preserving content privacy.
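The general principle, computing on data that stays encrypted, can be illustrated with the additively homomorphic Paillier scheme, one example of the partially homomorphic encryption mentioned above. The sketch below is a toy illustration, not the construction any platform would deploy: the key is far too small to be secure, the 8-bit "hash" stands in for a real robust hash, and the protocol is simplified to show only that a server can compute a distance between an encrypted signature and a known signature without ever decrypting it.

```python
# Minimal Paillier additively homomorphic encryption, used only to illustrate
# computing a Hamming distance on encrypted hash bits. Insecure toy parameters;
# NOT the actual photoDNA-in-encryption protocol.
import math
import random

def keygen(p=1789, q=1861):          # toy primes; real keys are 2048+ bits
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    mu = pow(lam, -1, n)             # works because g = n + 1
    return n, (lam, mu, n)

def encrypt(n, m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

def add(n, c1, c2):                  # Enc(a) * Enc(b) = Enc(a + b)
    return (c1 * c2) % (n * n)

def negate(n, c):                    # Enc(a)^-1 = Enc(-a mod n)
    return pow(c, -1, n * n)

# --- sender's side: encrypt each bit of the image's robust hash ---
n, priv = keygen()
image_hash  = [1, 0, 1, 1, 0, 0, 1, 0]        # toy 8-bit "hash"
cipher_bits = [encrypt(n, b) for b in image_hash]

# --- server's side: encrypted Hamming distance to a known signature ---
known_hash = [1, 0, 1, 1, 0, 1, 1, 0]
enc_distance = encrypt(n, 0)
for c, k in zip(cipher_bits, known_hash):
    # XOR against a known plaintext bit: k=0 -> Enc(b), k=1 -> Enc(1 - b)
    enc_xor = c if k == 0 else add(n, encrypt(n, 1), negate(n, c))
    enc_distance = add(n, enc_distance, enc_xor)

# Only the key holder can read the result; the server never sees the hash.
print("Hamming distance:", decrypt(priv, enc_distance))   # -> 1
```

The point of the sketch is the division of knowledge: the server manipulates only ciphertexts, so it learns nothing about ordinary images, while a small encrypted distance can still signal a match against known material.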
Alternatively, photoDNA can be implemented at the point of transmission, in contrast to the current approach, where content is scanned on the platform's servers. In this client-side implementation, the distinct signature is extracted prior to encryption and transmitted alongside the encrypted message. Because no identifying information can be extracted from this signature, it reveals nothing about the encrypted image while still allowing known CSAM and other harmful material to be detected.
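A conceptual sketch of that client-side flow follows. Here the imagehash library again stands in for photoDNA and the Fernet cipher stands in for Messenger's actual end-to-end encryption; the function name and file paths are hypothetical.

```python
# Client-side sketch: the signature is computed on the sender's device before
# encryption and travels alongside the ciphertext, so the server can check it
# against known material without being able to decrypt the image itself.
import imagehash
from PIL import Image
from cryptography.fernet import Fernet

def prepare_message(image_path: str, key: bytes) -> dict:
    """Run on the sender's device, before anything leaves the phone."""
    signature = str(imagehash.phash(Image.open(image_path)))   # robust hash
    with open(image_path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())             # then encrypt
    # Only the recipient holds the key; the server sees just the signature.
    return {"signature": signature, "ciphertext": ciphertext}

key = Fernet.generate_key()            # in practice, negotiated end to end
message = prepare_message("photo.jpg", key)
# The server compares message["signature"] against known signatures, exactly
# as in the upload check sketched earlier, without decrypting the payload.
```
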
We do not need to cripple our ability to remove some of the most harmful and heinous content in the name of an incremental amount of privacy. Zuckerberg has repeatedly expressed his desire to "get it right" this time. The technology exists to get it right. We now just need the will to do so.