Apple attempts to position itself as a leader in privacy rights, but that’s just grandstanding. Apple has announced plans to scan private photos on its customers’ devices in an attempt to stop child abuse.
The resulting violation of privacy is so far-reaching that there’s now a debate over the ethics of Apple’s plans. The Electronic Frontier Foundation (EFF), which describes itself as “the leading nonprofit organization defending civil liberties in the digital world,” said it was an “understatement” to say the organization was “disappointed by Apple’s plans.”
“Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again,” the digital rights group said in a report. “Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”
There are two main features that Apple is planning to roll out across its devices. The first is a scanning feature that would use an algorithm to check photos on the device, before they are stored in iCloud Photos, against known “Child Sexual Abuse Material (CSAM).” The second scans iMessage images sent or received by accounts designated as belonging to minors for sexually explicit content.
“Apple’s method of detecting known CSAM is designed with user privacy in mind,” the company claimed in a press release. Apple continued: “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image.”
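In rough terms, the process Apple describes amounts to hashing each photo on the device, testing it for membership in a blocklist of known CSAM hashes before upload, and sealing the result inside a voucher. The Swift sketch below is a loose illustration under stated assumptions: SHA-256 stands in for Apple’s NeuralHash perceptual hash, the SafetyVoucher type and makeVoucher function are hypothetical names invented for this example, and the private set intersection step is omitted entirely.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's system uses a perceptual hash
// (NeuralHash) and private set intersection (PSI); SHA-256 here is a
// stand-in, and this voucher is a plain struct, not Apple's encrypted
// cryptographic safety voucher.

/// Hypothetical stand-in for Apple's safety voucher.
struct SafetyVoucher {
    let imageID: String
    let matched: Bool   // in the real protocol the device cannot read this result
    let payload: Data   // "additional encrypted data about the image" in Apple's wording
}

/// Stand-in for a perceptual image hash (the real system uses NeuralHash).
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// On-device matching of one image against the known-CSAM hash set,
/// producing a voucher that records the match result.
func makeVoucher(imageID: String, imageData: Data,
                 knownHashes: Set<String>) -> SafetyVoucher {
    let matched = knownHashes.contains(imageHash(imageData))
    return SafetyVoucher(imageID: imageID, matched: matched, payload: imageData)
}

// Example: check a photo before it would be uploaded to iCloud Photos.
let blocklist: Set<String> = []   // populated by Apple in the real system
let photo = Data("example image bytes".utf8)
let voucher = makeVoucher(imageID: "IMG_0001", imageData: photo, knownHashes: blocklist)
print("match:", voucher.matched)
```

The omission matters: in Apple’s stated design, private set intersection “determines if there is a match without revealing the result,” whereas this sketch computes the match in the clear on the device.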
EFF isn’t buying Apple’s claims of keeping privacy in mind. The organization argued that “it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”
According to EFF, Apple already has the ability to view photos stored in iCloud Photos, though it does not scan them. Civil liberties groups have reportedly asked Apple to relinquish this access. Instead, Apple has chosen to move in the opposite direction by implementing its photo-scanning features.