“Facebook chief operating officer Sheryl Sandberg said the company's massive secret experiment designed to purposefully manipulate the emotions of its users was communicated ‘poorly.’
‘It was poorly communicated. And for that communication we apologize. We never meant to upset you.’”
Oh, sorry, did I put the wrong link in there?
Ah, here’s the correct link. Sorry, I was confused.
What do I think of Apple?
I stand by every word.
Apple can no longer be trusted on privacy in the West (they already couldn’t be trusted in China, where they capitulated to the Chinese government).
Is there an out for Apple? Can they still salvage their reputation? Maybe.
Let’s talk in words corporations understand:
1. Fire the VP responsible for this decision. One of those folks who takes the stage at your keynotes has to go.
2. Personal apology by Tim Cook admitting they were wrong (no ifs or buts) and promising to implement a contractual/constitutional promise that they will never implement any feature that uses data from your device to inform a third party or manipulate you (this includes ads)…
… This must include a firm and explicit contractual/constitutional promise to never implement client-side scanning, “ghost users” or any other circumvention of your privacy/end-to-end encryption in any jurisdiction.
I don’t see why anyone in the West should trust Apple for anything less than this. (And that’s if you ignore the whole China thing.)
Do I see this happening? Not really.
So, to reiterate what I said earlier:
You should no longer trust Apple on privacy.
PS. If you’re not familiar with the Apple/China situation, read this:
@aral Should we have ever trusted them in the first place? They were part of the Snowden revelations. Apple’s privacy stand was always a joke. I really wonder why there is blind faith in them. They are into advertisements. I emailed their privacy-related email address, asking them why they suggested some of the ads, as they had no online trace. Guess what: they redirected me to the terms and conditions page. You do something on a Google platform and it will follow you in the App Store. #privacy #apple
@aral I'm not sure that I would restrict this to Apple. The entire discussion about e2e encryption in the EU silently acknowledges that weakening encryption is bad, so the preferred way around that is client side scanning.
@jens Oh, I’m not limiting it to them. I haven’t trusted the others for a long time.
Also, having end-to-end encryption is quaint when you have client-side scanning. It’s like handing them clothes after you’ve seen them naked.
@aral Yeah. And yet it might be the only workable solution that satisfies all parties at least somewhat.
The proposal would be to hash images client side and compare them to a database of known bad content, then flag the account and related content for human inspection on a side channel.
A legal framework could e.g. include a legal requirement for a warrant when the match was detected before proceeding to the next step.
It'd not be that different from email now.
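The flagging flow described in the proposal above could be sketched roughly like this. Note the simplifying assumption: real client-side scanning proposals (e.g. PhotoDNA or Apple’s NeuralHash) use *perceptual* hashes that match visually similar images, whereas this sketch uses an exact SHA-256 match purely for illustration; the database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known bad content.
# (This entry is the SHA-256 of the bytes b"test", for demonstration only.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan(image_bytes: bytes) -> bool:
    """Hash the content client-side and check it against the database.

    Returns True when the content matches, i.e. the account and related
    content would be flagged for the next step. Under the legal framework
    sketched in the thread, a warrant would then be required before any
    human inspection on a side channel.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

The crucial point of contention in the thread follows directly from this sketch: the matching happens on the user’s device, before (or regardless of) any end-to-end encryption of the content in transit.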
My counter-proposal would be to permit private communications.
We trust people to have a talk in the park or completely unobserved interaction in their homes, bedrooms, bathrooms... so why on earth would we insist that there must be automated control of the content of their phones at all times? That's rubbish.
*If* there is reasonable suspicion, then law enforcement should get permission to crack a phone, on the same conditions (which could be tighter...) as for searching their home.
Hmm, could be. Aral said »It’s like handing them clothes after you’ve seen them naked.«, and you replied »And yet it might be the only workable solution...«
So it looks as if you were defending automated searches of private contents as some sort of compromise.
To which I say: No way.
Any individual phone can be cracked if there's a sufficiently large need. Automated searches are a violation of the presumption of innocence, a huge security liability, and a gift to autocrats.
@Mr_Teatime I agree with you. I'm not in favour of automated searches, either.
What I am saying is that the current discussion tends to be between two interest groups: one wants to keep e2e encryption intact, the other wants a way to go after child porn networks (that's the argument, at any rate). Both have a legitimate interest, and the balance might be client-side classification.
The third largely uninvolved interest group wants to keep privacy and the...
@Mr_Teatime ... presumption of innocence intact. They are not as involved in the discussion as they could be, at least from the angle that I can see.
They tend to get sort of subsumed into the e2e advocacy group, but technically they're distinct interests. Might be worth expressing as such to the EC.
I don't think that the e2ee and presumption of innocence groups are that separate.
The reason for supporting e2ee is that preventing private communications breaks civil liberty, and eventually democracy. And Apple's proposed feature breaks e2ee, so I don't really see the difference.
Now, technically you might say that's not true, but if someone has a spy on one (or both) of the ends, then encrypting communications between them becomes an attempt to limit the damage.
@Mr_Teatime Well, in the parts of the debate that I see - which involves interest groups discussing with EC representatives, or formulating public statements, that kind of thing - there is very little talk of how privacy invasion in whichever form undermines democracy. Somehow that topic is tacitly left unmentioned. That's all I'm saying.
Protecting democracy is the reason why privacy laws even exist, and why they have constitutional weight in Germany, for example.
We've had two states on German territory in the last century alone who used the absence of privacy to their great advantage in defending *against* democracy. Every child learns that in school.
...and then they forget it again because crime is evil and we must not let civil rights stand in its way or something...
They were part of PRISM, though, as we have learned, among other government programs.
In China they have broken encryption entirely.
Plus, everything they do is closed source.
We shouldn't trust them with privacy to begin with.