- cross-posted to:
- technology@lemmy.world
cross-posted from: https://kbin.social/m/technology@lemmy.world/t/525635
A nightmare scenario previously only imagined by AI researchers, where AI image generators accidentally spit out non-consensual pornography of real people, is now reality.
Oh no, if it isn’t the consequences of their actions.
Really shouldn’t have used training data that was obtained without consent.