Meta, formerly known as Facebook, said today that it will not roll out end-to-end encryption (E2EE) by default across its messaging platforms, including Instagram, until “sometime in 2023.” The announcement was made by Antigone Davis, Meta’s global head of safety.
“We’re taking our time to get this right,” Davis wrote, “and we don’t aim to finish the global rollout of end-to-end encryption by default across all our messaging services until sometime in 2023.” She added: “We’re determined to secure people’s private conversations and keep them safe online as a company that links billions of people worldwide and has established industry-leading technology.”
The delay, according to Davis, stems from the social media giant’s desire to implement the technology properly, in particular by retaining the ability to pass information to law enforcement to assist in child safety investigations.
“As we do so, there is a raging debate over how tech companies can continue to fight abuse and support law enforcement’s essential job if we can’t read your messages. We believe people shouldn’t have to choose between privacy and safety, which is why we’re incorporating strong safety measures into our plans and working with privacy and safety experts, civil society, and governments to ensure we get this right,” Davis wrote, adding that the company will use “proactive detection technology” to identify suspicious patterns of activity, as well as enhanced controls and the ability for users to report problems.
Since Facebook publicly revealed its aim to “e2ee all the things” more than two years ago, Western governments, notably the United Kingdom, have been pressuring the company to delay or abandon its plan to blanket its services in the strongest degree of encryption. The UK home secretary, Priti Patel, condemned the plan as “just unacceptable.”
Private messaging, according to the National Society for the Prevention of Cruelty to Children (NSPCC), is the “frontline of child sexual abuse online” because end-to-end encryption ensures that only the sender and recipient can see a message’s content, shielding it from both law enforcement and the tech platforms themselves.
In response to the NSPCC’s concerns, Davis asserted that Meta would still be able to detect abuse under its encryption plans by analyzing non-encrypted data, account information, and user reports. WhatsApp already makes reports to child safety authorities in a similar way. Davis said a recent review of some past cases showed that “we would have been able to disclose essential information to the authorities even if those services had been end-to-end encrypted.”
While WhatsApp, owned by Facebook (now Meta), has had E2EE since 2016, most internet giants’ services do not guarantee that only the user holds the keys to decrypt message data. Those providers can be subpoenaed or served with a warrant to hand message data over to government officials.
However, in the aftermath of the Cambridge Analytica data misuse scandal, Facebook CEO Mark Zuckerberg said in 2019 that the company would work toward deploying end-to-end encryption across all its services as part of a “pivot to privacy.” Although Zuckerberg did not give a firm date for the rollout, Facebook stated that it would be completed by 2022.
End-to-end encryption protects data by encrypting, or scrambling, it as it travels between phones and other devices. Usually, the only way to read an intercepted message is to gain physical access to an unlocked device that sent or received it.
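The idea can be illustrated with a short sketch: the sender and recipient share a secret key, the message is scrambled with that key before it leaves the device, and any relay in the middle sees only unreadable ciphertext. This is a toy teaching example in Python using only the standard library; it is emphatically not real cryptography (production E2EE systems such as WhatsApp’s use the vetted Signal protocol, with proper key exchange and audited primitives). The function names and the keystream construction here are invented for illustration.

```python
# Toy sketch of the E2EE idea: only holders of the shared key can read
# the message; a server relaying the blob sees only ciphertext.
# NOT real cryptography -- for illustration only.
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + nonce + counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)  # fresh randomness per message
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # integrity check
    return nonce + tag + ct  # this blob is all a relay server ever sees

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, tag, ct = blob[:16], blob[16:48], blob[48:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("message tampered with or wrong key")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))

# Sender and recipient share a key; without it, the blob is unreadable.
shared_key = secrets.token_bytes(32)
wire = encrypt(shared_key, b"meet at noon")
assert decrypt(shared_key, wire) == b"meet at noon"
```

Note that the server relaying `wire` cannot decrypt it without `shared_key`, which is exactly why platforms deploying E2EE say they cannot hand message contents to authorities. Real systems additionally solve the hard part this sketch skips: agreeing on the shared key without the server ever learning it.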
Recall that last month, Facebook whistleblower Frances Haugen expressed concern about the company’s use of the technology: because its implementation is proprietary (as opposed to open source), users must take Meta’s claims on trust, since independent third parties can’t verify that the code does what it claims. She also said that outsiders cannot know how Facebook interprets E2EE, which is why she is concerned about the company’s plan to extend its use, “because we have no idea what they will do,” as she put it.
“We don’t know what that means, and we don’t know whether people’s privacy is genuinely safeguarded,” Haugen told UK legislators. “It’s incredibly complex, and it’s also a different environment. There is no directory where you can locate 14-year-olds on the open-source end-to-end encryption software that I like to use, and there is no directory where you can go to find the Uyghur community in Bangkok. Accessing vulnerable populations on Facebook is trivially easy, and national state actors are doing so.” Haugen was cautious in her support for E2EE, saying she prefers open-source security implementations that external experts can scrutinize.
Every day, 2.8 billion people use Meta’s apps. In 2020, the internet industry reported more than 21 million cases of child sexual abuse on its platforms to the US National Center for Missing and Exploited Children; Facebook accounted for more than 20 million of those reports.