Sammeeeeeee@lemmy.world to Technology@lemmy.world · English · 1 year ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com) · 47 comments
Cross-posted to: technology@lemmy.world, ITNscience_and_technology@fedinews.net, ITNsocial_issues@fedinews.net, technology@chat.maiion.com, technology@lemmy.ml, fediverse@lemmy.ml, technology@beehaw.org, technews@radiation.party
whenigrowup356@lemmy.world · 6 points · 1 year ago
Shouldn't it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?
ozymandias117@lemmy.world · 4 points · 1 year ago
Those databases are highly regulated, since they are themselves CSAM.

Apple tried using fuzzy hashes so the database could be matched against on devices, and even that wasn't able to reliably identify things.
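For context on why fuzzy matching is hard: the general idea behind systems like this is a perceptual hash, where visually similar images produce similar bit strings and a small Hamming distance means "probably the same picture". The sketch below is a toy average-hash on an 8x8 grayscale grid, purely illustrative; it is not Apple's NeuralHash or PhotoDNA, and real scanners use far more robust (and still imperfect) algorithms.

```python
# Toy perceptual ("fuzzy") hash: average-hash over an 8x8 grayscale grid.
# Illustrative only -- not any real CSAM scanner's algorithm.

def average_hash(pixels):
    """64-bit hash: each bit records whether a pixel is above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; a small distance means 'likely the same image'."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy "image": top half dark, bottom half bright.
img = [[20] * 8 for _ in range(4)] + [[220] * 8 for _ in range(4)]
# Slightly brightened copy, standing in for a re-encoded/resized version.
near_copy = [[min(255, p + 10) for p in row] for row in img]
# A genuinely different image: vertical split instead of horizontal.
other = [[20] * 4 + [220] * 4 for _ in range(8)]

h_img, h_copy, h_other = map(average_hash, (img, near_copy, other))
print(hamming(h_img, h_copy))   # 0  -- the near-duplicate still matches
print(hamming(h_img, h_other))  # 32 -- clearly a different image
```

The catch is exactly the reliability problem mentioned above: the distance threshold trades false negatives (slightly edited images slipping through) against false positives (unrelated images colliding), and adversarial perturbations can push images across that threshold in either direction.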