TGhost [She/Her]@lemmy.ml to Privacy@lemmy.ml · 9 months ago
Google Agrees to Delete Billions of Files Collected in Chrome Incognito (restoreprivacy.com)
cross-posted to: technologie@jlai.lu
__init__@programming.dev · 9 months ago
Yes, this will definitely happen

Pumpkin Escobar@lemmy.world · 9 months ago (edited)
Step 1: move data from files into a database
Step 2: delete files
Step 3: press release that we just deleted the files
— Google, probably

BeatTakeshi@lemmy.world · 9 months ago (edited)
Can they even untrain an AI with a given set of data?
“Bard, forget this ever existed”
“Sure, I’ll make a copy so I won’t forget to forget”

flying_wotsit@lemmy.blahaj.zone · 9 months ago
Machine unlearning is a very active field right now, but basically no, not really

Possibly linux@lemmy.zip · 9 months ago
You laugh, but it would be really dumb for this not to happen. They would be held liable, and most of the data they collect doesn’t come from incognito

snooggums@midwest.social · 9 months ago
Sure they will be held liable. Just like everything else they are held liable for.

Possibly linux@lemmy.zip · 9 months ago
They aren’t liable for much

Dupree878@lemmy.world · 9 months ago
That’s @snooggums@kbin.social’s point

BeatTakeshi@lemmy.world · 9 months ago
Why isn’t there a sarcasm flair on Lemmy (heck, on the Internet)? /s

Rai@lemmy.dbzer0.com · 9 months ago
Back in my day, we didn’t need it! Sarcasm was much more understood on LUE, Gen[M]ay, SA, and DDRFreak!
(I haven’t been to these sites in a long time, so please don’t crucify me if they’re shitholes now. Other than DDRFreak. RIP DDRFreak)

sadreality@kbin.social · 9 months ago
That’s the point… so your original statement does not carry much weight.

Scolding0513@sh.itjust.works · 9 months ago
You are way too optimistic, my man, haha