Apple’s 1.65 billion iPad, iPhone and MacBook users have received a serious warning.

In a stunning new post, Edward Snowden has delved into Apple’s CSAM (child sexual abuse material) detection system, which is coming to Apple’s approximately 1.65 billion active iPhones, iPads and Macs next month. He states: “Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.” He also shows what you can do about it, at least for now.

CSAM detection works by matching a user’s images against a database of known illegal material. “Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and… if enough ‘forbidden content’ is discovered, law-enforcement will be notified,” Snowden explains. “Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.”
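Snowden’s objection turns on where that matching happens. A minimal Python sketch of the general idea, on-device fingerprinting of photos against a distributed database of known-CSAM hashes with a reporting threshold, might look like the following. Everything here is illustrative: the function and variable names are invented for clarity, and the plain SHA-256 comparison stands in for Apple’s actual design, which uses a perceptual hash (NeuralHash) and cryptographic safety vouchers rather than exact digests. Apple has stated a reporting threshold of roughly 30 matching images.

```python
import hashlib
from pathlib import Path

# Hypothetical on-device database of fingerprints of known illegal images.
# Apple's real system ships a blinded perceptual-hash (NeuralHash) database
# and compares via private set intersection, not plain SHA-256 digests.
FORBIDDEN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

# Apple has said reporting is only triggered past a threshold of roughly
# 30 matching images; the mechanism here is heavily simplified.
MATCH_THRESHOLD = 30

def scan_before_upload(photo_paths):
    """Return True if enough photos match the database to trigger a report.

    The point Snowden makes: this check runs on the user's own device,
    before the photos ever reach Apple's iCloud servers.
    """
    matches = 0
    for path in photo_paths:
        digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
        if digest in FORBIDDEN_HASHES:
            matches += 1
    return matches >= MATCH_THRESHOLD
```

The contested design choice is that a check like `scan_before_upload` runs client-side: a server-side scan of photos already uploaded to iCloud would leave the device itself working only for its owner, which is exactly the boundary Snowden says Apple plans to erase.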
“The day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used,” says Snowden. Moreover, while he cites “compelling evidence” from researchers that Apple’s CSAM detection system is seriously flawed, he draws attention to a much bigger point:
“Apple gets to decide whether or not their phones will monitor their owners’ infractions for the government, but it’s the government that gets to decide what constitutes an infraction… and how to handle it.”