“Facebook chief operating officer Sheryl Sandberg said the company's massive secret experiment designed to purposefully manipulate the emotions of its users was communicated ‘poorly.’

‘It was poorly communicated. And for that communication we apologize. We never meant to upset you.’”


Oh, sorry, did I put the wrong link in there?

Also, read this: cyberlaw.stanford.edu/blog/201

We’re not confused about what Apple is doing. They’re implementing client-side scanning… as requested by Barr and the FBI back at the end of 2019.

And, as the Federighi interview shows, they’re barefacedly lying about it. They know exactly what they’re doing and why.

I’ve lost all trust in this company.

(The only trust I had anyway was because of their business model.)

Apple is just as bad as Google, if not worse, at this point.

Is there an out for Apple? Can they still salvage their reputation? Maybe.

Let’s talk in words corporations understand:

1. Fire the VP responsible for this decision. One of those folks who takes the stage at your keynotes has to go.

2. Personal apology by Tim Cook admitting they were wrong (no if/buts) and promising to implement a contractual/constitutional promise they will never implement any feature that uses data from your device to inform a third party or manipulate you (includes ads)…

… This must include a firm and explicit contractual/constitutional promise to never implement client-side scanning, “ghost users” or any other circumvention of your privacy/end-to-end encryption in any jurisdiction.

I don’t see why anyone in the West should trust Apple for anything less than this. (And that’s if you ignore the whole China thing.)

Do I see this happening? Not really.

So, to reiterate what I said earlier:

You should no longer trust Apple on privacy.

@aral Should we ever have trusted them in the first place? They were part of the Snowden revelations. Apple’s privacy stance was always a joke. I really wonder why there is such blind faith in them. They are in the advertising business. I emailed their privacy-related email address asking why they suggested certain ads that had no online trace, and guess what: they redirected me to their terms and conditions page. Do something on a Google platform and it will follow you into the App Store. #privacy #apple

@aral I'm not sure that I would restrict this to Apple. The entire discussion about e2e encryption in the EU silently acknowledges that weakening encryption is bad, so the preferred way around that is client side scanning.

@jens Oh, I’m not limiting it to them. I haven’t trusted the others for a long time.

Also, having end-to-end encryption is quaint when you have client-side scanning. It’s like handing them clothes after you’ve seen them naked.

@aral Yeah. And yet it might be the only workable solution that satisfies all parties at least somewhat.

The proposal would be to hash images client side and compare them to a database of known bad content, then flag the account and related content for human inspection on a side channel.

A legal framework could e.g. include a legal requirement for a warrant when the match was detected before proceeding to the next step.

It'd not be that different from email now.
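The proposal above (hash content on the device, compare against a database of known-bad hashes, and only then flag for human review) can be sketched roughly as follows. This is a simplified illustration, not Apple's actual system: real deployments use *perceptual* hashes (e.g. PhotoDNA or Apple's NeuralHash) that survive resizing and re-encoding, whereas the plain SHA-256 used here only matches byte-identical files. The database entry is an illustrative value (the SHA-256 of the bytes `"foo"`).

```python
import hashlib

# Hypothetical database of known-bad content hashes (illustrative value:
# this is just the SHA-256 of b"foo", standing in for real entries).
KNOWN_BAD_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def hash_content(data: bytes) -> str:
    """Compute a content hash client-side; nothing leaves the device here."""
    return hashlib.sha256(data).hexdigest()

def scan(data: bytes) -> bool:
    """Return True if the content matches the database, i.e. the account
    would be flagged for human inspection on a side channel -- in the
    legal-framework idea above, only after a warrant."""
    return hash_content(data) in KNOWN_BAD_HASHES

print(scan(b"foo"))    # matches the illustrative entry -> True
print(scan(b"hello"))  # no match -> False
```

Note that the entire privacy debate hinges on the step hidden inside `scan`: the matching happens on your own device, against a database you cannot inspect, before any warrant exists.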

My counter-proposal would be to permit private communications.

We trust people to have a talk in the park or completely unobserved interaction in their homes, bedrooms, bathrooms... so why on earth would we insist that there must be automated control of the content of their phones at all times? That's rubbish.

*if* there is reasonable suspicion, then law enforcement should get permission to crack a phone, on the same conditions (which could be tighter...) as searching their home.

@Mr_Teatime I have no interest in defending something I don't like, but I feel like you misunderstood my post there, and sort of forced me to do so.

Can I ask instead what you consider the main difference between what I wrote and what you did?


Hmm, could be. Aral said »It’s like handing them clothes after you’ve seen them naked.«, and you replied »And yet it might be the only workable solution...«

So it looks as if you were defending automated searches of private contents as some sort of compromise.

To which I say: No way.
Any individual phone can be cracked if there's a sufficiently large need. Automated searches are a violation of the presumption of innocence, a huge security liability, and a gift to autocrats.

@Mr_Teatime I agree with you. I'm not in favour of automated searches, either.

What I am saying is that the current discussion tends to be between two interest groups: one wants to keep e2e encryption intact, the other wants a way to go after child porn networks (that's the argument, at any rate). Both have a legitimate interest, and the balance might be client-side classification.

The third largely uninvolved interest group wants to keep privacy and the...

@Mr_Teatime ... presumption of innocence intact. They are not as involved in the discussion as they could be, at least from the angle that I can see.

They tend to get sort of subsumed into the e2e advocacy group, but technically they're distinct interests. Might be worth expressing as such to the EC.

I don't think that the e2ee and presumption of innocence groups are that separate.

The reason for supporting e2ee is that preventing private communications breaks civil liberty, and eventually democracy. And Apple's proposed feature breaks e2ee, so I don't really see the difference.

Now, technically you might say that's not true, but if someone has a spy on one (or both) of the ends, then encrypting communications between them becomes an attempt to limit the damage.

@Mr_Teatime Well, in the parts of the debate that I see - which involves interest groups discussing with EC representatives, or formulating public statements, that kind of thing - there is very little talk of how privacy invasion in whichever form undermines democracy. Somehow that topic is tacitly left unmentioned. That's all I'm saying.

Protecting democracy is the reason why privacy laws even exist, and why they have constitutional weight in Germany, for example.

We've had two states on German territory in the last century alone who used the absence of privacy to their great advantage in defending *against* democracy. Every child learns that in school.

...and then they forget it again because crime is evil and we must not let civil rights stand in its way or something...


They are part of PRISM though, as we have learned, among other government programs.
In China they have broken encryption entirely.
Plus everything they do is closed source.

We shouldn't trust them with privacy to begin with.

@msavoritias @aral for all their public-facing sophistication when the curtain gets peeled back you see that their systems are all just cobbled together off-the-shelf products: samcurry.net/hacking-apple/

@oppen @aral

That's no surprise here.

To be honest, I am always surprised when people bring up Apple as a cutting-edge tech company or privacy advocate.

Outside of the programs I mentioned earlier, they even send a hash of every install you do on your Mac back to their servers in *cleartext*.

@aral People will forget soon. Nothing can break their blind love for that device.

@aral I feel like it's desperate to ask for resignations and apologies when we should actually be concerned that companies have the means to do what they want to do, and the only thing that stops them is the public frowning very hard.

@aral Break monopolies, tax Apple, invest in open-source cryptography and infrastructure, don't rely on Big Tech© to protect your interests. Those are a few things we should think about instead of asking Apple not to be bad. Our governments should also be held responsible for protecting us, not spying on us with the help of their billionaire libertarian tech overlords.

@aral Yup, they ceded control of customer data on servers run by state-owned firms.
