“…this new single-minded focus on CSAM in the revived anti-encryption push feels like an exceedingly cynical move on the part of the U.S. government … One proposal … is to build a system where the provider … would check content … before it’s encrypted and transmitted … i.e. while the content is on the sender’s device … to try to figure out whether that content is or might be abusive content such as CSAM.” – Riana Pfefferkorn, Oct 7, 2019.
Note the date.
Also read Riana’s follow-up, from May 2020, titled “client-side scanning and Winnie-the-Pooh redux (plus some thoughts on Zoom).”
Given this timeline, it’s hard to see Apple’s “client-side scanning” plans as anything but the brainchild of Barr and the FBI, hatched during the Trump administration’s war on end-to-end encryption. “Terrorism” didn’t work as a justification, so CSAM it is. The “ghost user” proposal didn’t work, so “client-side scanning” it is.
...it is staggeringly naive to believe that ... client-side pre-encryption “content moderation” would stop at CSAM.
If you ... accept ... censoring ... because it might make it a little easier for the government to catch the most inept pedophiles, then I’m not sure I’ve got a lot else to say to you.
...before very long, you won’t be able to say what you please ... while pedophiles learn how to continue operating without detection.
This is also bad for true ownership of our devices: such scanning can only work if users don’t control their devices enough to circumvent it.