Sapphire Velvet@lemmynsfw.com to Technology@lemmy.world · English · 11 months ago
Child sex abuse images found in dataset training image generators, report says (arstechnica.com)
cross-posted to: technology@lemmy.zip, ghazi@lemmy.blahaj.zone
The report: https://stacks.stanford.edu/file/druid:kh752sm9123/ml_training_data_csam_report-2023-12-20.pdf
Sapphire Velvet@lemmynsfw.com (OP) · 11 months ago
They’re not looking at the images, though. They’re scraping. And their own legal defenses rely on them not looking too carefully, else they cede their position to the copyright holders.
snooggums@kbin.social · 11 months ago
Technically they violated the copyright of the CSAM creators!