I don’t understand how “client-side scanning” - i.e. an invasive piece of code pushed by OS makers to YOUR computer or mobile device to scan YOUR files without your consent - is even being discussed.
This is tantamount to an Apple or Google rep forcibly entering your house, sitting on the couch next to you in your living room, and reporting to the mothership or the police what you watch on TV. People would take to the street if this was mandated by law. Yet they seem to be waiting with complete resignation for the Apple or Google rep to sit on their device and report what files they have on it.
How did we get here? This obscene proposal would have been a major scandal not 25 years ago. Actually, it wouldn’t even have been proposed at all. But today it’s on the verge of becoming law! The mind boggles…
deleted by creator
If the system was transparent, open, and provided an easy way to get false positives sorted, I wouldn’t necessarily even have a problem with the concept.
How can you even say that?
This is what baffles me the most: how does anyone even entertain the idea of letting a third party scan their own files on their own device uninvited? Even if the process is transparent and there’s a 100% fool-proof way of taking care of false positives, the very idea of letting anyone scan anything on my computers in the first place is completely unacceptable!
People would never have deemed anything like this even remotely acceptable 25 years ago. But in 2023, enough people have internalized the idea that this actually has a chance to become law without creating an outrage. I am utterly distressed by what society is willing to accept nowadays.
deleted by creator
Your entire line of thinking hinges on the premise that the politicos (and presumably, whichever oligopolies they do the bidding of) will have their way one way or the other. What you’re saying is, if we don’t make concessions on client-side scanning and accept some implementation of it, the privacy-respecting tools we have now will be banned.
My question is this: why is any of this inevitable?
None of what’s being proposed here solves any problem. Pedo material can be fought with the legal and technical tools we have now, as demonstrated by the regular news of entire pedo rings being dismantled and pedophiles going to jail as a result.
The fact that you’re willing to make compromises on solutions to a fake problem means you’ve already acknowledged that we’ve lost.
The truth is, if people today were as outraged as people of my generation are over this, this false choice wouldn’t have to be made at all. Things are just fine the way they are today, and you don’t have to give up anything if you don’t assume you’ll have to give something up.
deleted by creator
The problem is that the criminals won’t use something monitored by the police. They aren’t dumb.
Google has been doing it on Drive for years now. False positives have been reported to the police several times, despite a human reviewing them.
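Those false positives come down to how these scanners match files: they typically compare perceptual fingerprints rather than exact bytes, and different files can share a fingerprint. Here’s a toy sketch in Python using a simplistic average hash — an illustration of the general technique only, not Google’s actual (proprietary) matcher:

```python
# Toy illustration of why perceptual-hash scanning can misfire.
# aHash: threshold each pixel against the image's own mean; images
# with the same bright/dark layout get the same hash even when their
# actual pixel values differ. Real systems (PhotoDNA, NeuralHash)
# are more sophisticated but share the same failure mode.

def average_hash(pixels):
    """64-bit aHash of an 8x8 grayscale grid (a list of 64 ints)."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two different "images" with the same checkerboard layout:
# one uses 0/255 pixels, the other 50/200.
img_a = [255 if (i // 8 + i) % 2 else 0 for i in range(64)]
img_b = [200 if (i // 8 + i) % 2 else 50 for i in range(64)]

assert img_a != img_b                               # different pixel data
assert average_hash(img_a) == average_hash(img_b)   # same hash => "match"
```

Two files with different contents hash to the same fingerprint, so a hash hit alone can’t prove the files are the same — which is exactly where the human-review step is supposed to (and evidently doesn’t always) catch the error.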
Yeah, but that’s different: you entrust files to Google Drive. It’s their digital real estate, so I expect them to do whatever they want with whatever you put on it. If you don’t want false positives, don’t send your files to Google.
But your cellphone or your computer at home is your digital real estate. It’s your home. I for one do not welcome Google in my home, and I absolutely refuse to let them see what’s inside it.
Because really, client-side scanning is nothing more than home invasion.
Oh, something very similar has been proposed already some time ago, just under the guise of stopping terrorism. That excuse evidently doesn’t work anymore.
It’s already being discussed as legislation, and people still aren’t rioting. Chat Control 2.0 is exactly this.
Johansson, however, has not blinked. “The privacy advocates sound very loud,” the commissioner said in a speech in November 2021. “But someone must also speak for the children.”
Fuck the children.
Well, don’t fuck the children…
Yep, fucking the children will only add more fuel to the fire. Fuck the pedos, though.
The arrogance of her statement is really frustrating. People who know more about this domain than you do are telling you it’s a bad idea, you shithead!
You know what? I know what you mean, but on the internet of 2023, I would never post that last line on a forum for fear of it being archived by Google and used against me in some form or other years later. This is the sort of self-censorship one has to do these days.
Keep in mind that one of the leading organizations pushing laws like this is Thorn. You know, the one Ashton Kutcher ran. You know, the guy who sent a letter to a judge asking for leniency for a convicted serial rapist. All these laws are smokescreens to take away people’s right to privacy and dissent.
Once a system like this is up and running, nothing is stopping a government from abusing it.
Oh actually, we think it’d be a good idea to broaden its capabilities to do stuff we’ve never explicitly mentioned before. You don’t mind, do you?
I’m all for the fight against child abuse, but these actions only happen under the guise of fighting child abuse. If governments implemented awareness campaigns and destigmatized abusers so they could seek help, and dealt with the core issues rather than chasing the shit storm that’s already been and gone, we’d have a far better society. It’s really about control, though: once you’ve got money and a high-society social group, what else is there?
It’s never about creating a better society, just oppression and exploitation. The more criminals you easily fabricate, the more indentured servants you easily get.
Exactly. It’s a joke.
They may benefit from it, but it’s pretty hard to believe that a bunch of sleazy “AI can do everything” snake oil salesmen, along with the politicians and lobbyists they’ve bought, got to be this influential and well-funded on their own. It’s not as if their arguments are all that convincing on their merits.
I wish a big company would go against the grain on the child protection issue.
Everyone wants to protect children, but child predators aren’t going to be storing their abuse materials on the cloud.