Why are the britbongs so hellbent on censorship? Is it still to target trans youth?
I hear by 2030 you won’t be able to look at your own genitals until you’re 30.
oi ya got a loisence to peep atcher willy?
You gawking at ya fanny are ya? That’s a trip in the paddy wagon it is! Straight to Wandsworth!
Finally, a guaranteed path to a disciplined Leninist mass movement.
AI scanning all photos for nudes? nothing could go wrong there.
Next up: censorship of any content the State doesn’t like. Which will be a lot of content.
Pro-Palestinian content seems obvious. Anything against NATO, anything against Britain, anything communist. Strangely (not really), fascist iconography will be left untouched and they’ll be free to share as many fash memes as they like, while the most mundane, weak left-lib anti-imperialist stuff gets blocked and gets you a message that you’ve been reported to your government for extremism.
Always the problem with this stuff in the west. The problem with automated scanning of all images against a CSAM hash database is this: I have no objection, of course, to stopping pedophiles or the distribution of that material, but there is no way, just no way, that having created such a system that operates against phones, laptops, tablets, everything, the West (5/14/21 Eyes, global snooping and spying, decades of illegal surveillance, entrapment, etc.) won’t begin slipping in hashes of political content. Oh, it’ll be the most extreme stuff first, ISIS propaganda images, then support for the IRA, then general communist iconography and memes. And upon learning of this the copyright industry will demand inclusion, will demand a similar system for identifying copies of movies, TV shows, etc. and alerting them of violations, or blocking or deleting such content when found.
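The mechanism here is worth spelling out, because it shows why the slippery slope is real. A minimal, hypothetical Python toy (a crude average-hash, nothing like the far more robust features real systems such as PhotoDNA use; all names below are made up for illustration) demonstrates that the matcher itself is completely content-agnostic: it only compares numbers, so whoever controls the blocklist decides what gets flagged, and adding a hash of a political image is a one-line change.

```python
# Toy sketch of hash-based image scanning, heavily simplified.
# The matcher has no idea what a hash *means* -- the blocklist is just
# a set of numbers, whatever their origin.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if brighter than the mean.

    `pixels` is a row-major list of grayscale values (0-255) from a
    downscaled image; real systems extract far more robust features.
    """
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_flagged(image_pixels, blocklist, threshold=3):
    """Flag the image if its hash is within `threshold` bits of any entry."""
    h = average_hash(image_pixels)
    return any(hamming(h, bad) <= threshold for bad in blocklist)

# Demo: a 4x4 "image" and a near-duplicate (one pixel tweaked).
original = [10, 200, 30, 220, 15, 210, 25, 230,
            12, 205, 28, 215, 11, 199, 31, 225]
tweaked = original[:]
tweaked[0] = 14  # slight edit -- the fuzzy match still catches it

blocklist = {average_hash(original)}
print(is_flagged(tweaked, blocklist))  # True: near-duplicates match
```

Note that nothing in `is_flagged` distinguishes a CSAM hash from a hash of a protest flyer; the political-content scenario above requires no new code at all, only new entries in the blocklist.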
The other thing is that detecting these images requires training data, literally an AI trained on TBs of CSAM. Which means those models now exist, and if they’re being used, they will be cracked, and they will be used by the people who crack them.
Eh, I find this implausible and kind of a weak objection. People can’t even crack Denuvo, and I’m sure there are a lot more poor gamers than there are pedophiles who are unhappy with existing models and methods and have a burning desire for something like this. Fact is, anyone close enough to this to “leak” it would almost certainly have the skills to just refine and train an open model to do the same thing with no risk of being caught.
Fact is, existing open-source models can already generate simulated images of children nude or engaged in sexual conduct. I don’t think this model would be some great and amazing leap beyond that (it would likely at times generate results with flaws and oddities, and detect generated images with similar flaws and oddities, same as all current models), and the only people with access would be heavily monitored and would face prison time and the ruin of their lives if they tried to walk off with the whole model. Frankly there’s no reason for it: pedophiles already have models they use and can adapt open models, feed them their own collections, or just tune existing models trained entirely on adults and pictures of clothed children.
The actual training dataset, if real CSAM is used, would almost certainly come from, and be handled entirely in-house by, the FBI, NCMEC, the equivalent British agencies, etc., who hold vast collections of CSAM. Certain top tech people who currently work on CSAM detection in collaboration with these agencies might be involved as well, but a “leak” of the kind you suggest seems improbable.
I’m not axiomatically opposed to this technology. In a socialist state, under a trustworthy party that cares for the well-being of the people and intends to use this tool for this purpose and only this purpose, it wouldn’t be a problem. The issue is that in the west this is just a stepping stone: the same cracked-open door gets used to implement detection of communist content, anti-imperialist content, trans content, etc., to build enemies lists for the coming crackdown and the violence they’ll inflict, and to exercise total control over electronic communications with a chilling effect on any questioning of the official narratives.

There’s also the issue of over-detection and no real care for appeal or justice. In a decent state, hypothetically, a false detection wouldn’t ruin your life. Meanwhile in the west, Google seeing you share a naked photo of your child with their doctor will NEVER reverse its decision to terminate your account and ban you for life over it, and doesn’t care a bit about giving you appeal, fairness, etc. So there will be innocent casualties over things that aren’t even nudity or children, and those people risk being digitally un-personed as a result, to say nothing of being raided by the police and put through the wringer of having their house searched and their electronics seized and not returned for weeks or months. None of that matters to these companies or the government, and that’s the problem.





