You’ve got nothing to hide so you’ve got nothing to fear, right?
“A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.”
https://inkl.com/a/EbYLJlcYkxY
(Remember Apple was/is(?) going to implement this sort of scanning by default on all your devices.)
Wow that's scary.
I've taken photos of my son when he's had a medical problem to send to the docs when a visit wasn't possible. Nothing that includes genitals thankfully, but an infected and swollen penis in young boys is not uncommon. That happened to my son twice IIRC, both requiring medical help to clear up.
Given that doctors are seeing patients less these days, sending pics to complement a phone call is quite normal, so more parents are going to get caught by this.
Also, given that Google wants people to abandon passwords in favour of using their mobile phone as the key for passwordless sign-ins across the web, the consequences of having that key unceremoniously removed with no warning could be devastating.
That's not a feature I'm ever going to use personally. I've had too many issues with phone problems over the years. @bitwarden ticks all my boxes and I'm sticking with it.
"That's not a feature I'm ever going to use"
I appreciate the sentiment but I expect a time will come when the choice will be made for you.
I've been trying to walk this kind of path and am finding my ability to maintain bulwarks against wholesale data hoovering being carved away on many fronts.
My bank has just started using device verification to confirm online purchases before it will authorize them, specifically via the app on my phone. I've not yet tried logging into my web account to see if that option is available there too; I hope so. The prospect of a phone failure (for any reason) cutting me off from my bank account is not a pleasant one.
Is your concern Google, or is your concern passwordless logins?
Passwordless logins tied to my (or any) mobile phone. Apple has plans to implement this feature too. If it becomes ubiquitous, the temptation for websites and services to buy into it will be very strong.
If it's a technical argument, then unless there's some new aspect of this that I'm unaware of, it's about the use of FIDO keys, and simply storing the keys on the phone.
If I'm right, then we have to ask whether the phone or a specialty token is better (and we can quantify what better means).
I agree that storing a cornerstone cryptographic key on a system you don't control is pretty darn scary, and the risks associated are huge.
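For context on what is actually stored where: below is a minimal sketch of a FIDO2/WebAuthn passkey registration as it looks from the browser side (the relying-party name, user handle and in-page challenge are illustrative placeholders, not any real site's flow). The key point is in the comments: the authenticator generates the key pair and the private key never leaves the device, which is both what makes it phishing-resistant and why losing the phone can mean losing the login.

```typescript
// Minimal sketch of FIDO2/WebAuthn credential registration in the browser.
// The relying party, user details and challenge are made-up placeholders;
// in a real flow the challenge is random bytes issued by the server.
async function registerPasskey(): Promise<void> {
  const publicKeyOptions: PublicKeyCredentialCreationOptions = {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // placeholder; server-issued in practice
    rp: { name: "example.com" },
    user: {
      id: new TextEncoder().encode("user-1234"), // opaque user handle
      name: "alice@example.com",
      displayName: "Alice",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // -7 = ES256
    authenticatorSelection: {
      // "platform" = the phone/laptop itself; "cross-platform" = a roaming
      // token such as a YubiKey. This is exactly the phone-vs-token trade-off.
      authenticatorAttachment: "platform",
      residentKey: "preferred",
    },
  };

  // The authenticator generates a key pair and returns only the *public* key.
  // The private key never leaves the device, which is why losing the device
  // (or access to it) can mean losing the login.
  const credential = await navigator.credentials.create({ publicKey: publicKeyOptions });
  console.log("New credential ID:", (credential as PublicKeyCredential).rawId);
}
```

Whether that private key lives on a phone, a laptop or a dedicated token doesn't change the protocol at all; it only changes who controls the hardware and what happens when it fails or is revoked.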
@aral Shocking! We were all forced to take photos to send to doctors during lockdown! How many more people have been caught out like this?!
@aral It's not implausible that voice analysis (Google Home, Alexa et al.) will be used similarly, looking for supposed criminal intent in what is spoken.
@aral As a Google Photos user and a father of a toddler, this has been on my mind practically the entire time, especially now as we vacation on the Mediterranean coast and my kid sometimes swims naked. You're not paranoid if you're right, or however that saying goes. Sadly, I can't switch to some other, self-hosted solution due to the high technological barrier to entry.
@aral I've got some mixed feelings about these things:
1. On one hand, making sure that child pornography doesn't proliferate is a battle worth fighting - while being well aware that automatic detection of suspicious content will necessarily violate people's privacy, and that finding a trade-off may prove impossible.
2. On the other hand, *real* pedophiles are probably already aware that they could end up in jail for the content they store and exchange, so they are already likely to employ strategies to bypass these checks - from using E2EE to store and exchange content, to simply not using Google/Apple photo or camera apps on their phones.
It'd be nice for Google and Apple to disclose how many real pedophiles they have managed to catch thanks to these media content checks, and how many were false positives like this poor father sharing pictures of his son with his doctor. If the number of false positives outweighs the number of bad guys caught, there should be no doubt that such checks should be ditched. Of course, Google and Apple will never disclose such data - like many others, they care more about having a handy excuse to justify their invasive surveillance strategies than about actually protecting the weakest. Had they really cared about fighting child pornography, they would have directed their resources and efforts at the copious amount of it shared on the dark web.
And, even if such checks were actually effective and legitimate, Google ought to apologize to those caught in the net through no fault of their own. Imagine being a father with a sick child who suddenly has to explain to the police why he sent a picture of his kid to his doctor. Imagine what an ugly and embarrassing moment that must be. What really enraged me was Google's reaction - after the father had *already* been interrogated and cleared of all charges, they responded with their usual impersonal, faceless and dull corporate message: "the safety of our customers and fighting online crime are our priorities bla bla bla". Not a single word of apology to a father who had to spend hours in a police station after being unjustly reported for something as ugly as child pornography.
@blacklight@social.platypush.tech @aral@mastodon.ar.al Probably not many. Pop the pictures in a password-protected zip file; it's extremely mundane to bypass if you're actually trying at all.
Can't check the pictures if there are no pictures to check.
@aral I believe Apple was implementing detection of previously known images by signature, not some sort of automatic discovery of new, offending images.
@dfraser Indeed, version 1 was to have hash-based detection (which has already been broken, with researchers demonstrating false-positive hash collisions). Another difference was that it was going to take place on your device. But all of these are slippery slopes. And Apple hasn't abandoned its plans either (at least it hasn't committed to doing so publicly), so they can resurface at any point (likely with better public relations this time).
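To make the "known images vs. content" distinction in this sub-thread concrete, here's an illustrative sketch of hash-list matching: fingerprint the file, look the fingerprint up in a vendor-supplied blocklist. The SHA-256 digest and the blocklist entry are stand-ins purely for illustration; the real systems use perceptual hashes (PhotoDNA, Apple's NeuralHash) so that resized or re-encoded copies still match, which is precisely where the demonstrated collisions and false positives come from.

```typescript
// Illustrative sketch of "known image" matching: hash the file, check the hash
// against a blocklist. Real deployments use perceptual hashes (PhotoDNA,
// NeuralHash) rather than a plain SHA-256, so near-duplicates still match.
const KNOWN_BAD_HASHES = new Set<string>([
  // placeholder hex digest standing in for a vendor-supplied blocklist
  "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]);

async function isKnownImage(imageBytes: ArrayBuffer): Promise<boolean> {
  const digest = await crypto.subtle.digest("SHA-256", imageBytes);
  const hex = Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
  // A match is decided entirely by the blocklist: the system never "looks at"
  // what the photo shows, which cuts both ways in the argument above.
  return KNOWN_BAD_HASHES.has(hex);
}
```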
@aral @thatkruegergirl Apple is matching known images, not content.
So don't share it until it ends up as a known image, since at that point you're distributing child pornography by definition.