TGhost [She/Her]@lemmy.ml to Privacy@lemmy.ml · 10 months ago
Google Agrees to Delete Billions of Files Collected in Chrome Incognito (restoreprivacy.com)
cross-posted to: technologie@jlai.lu
__init__@programming.dev · 10 months ago
Yes, this will definitely happen

Pumpkin Escobar@lemmy.world · 10 months ago
Step 1: move data from files into a database
Step 2: delete files
Step 3: press release that we just deleted the files
Google, probably

BeatTakeshi@lemmy.world · 10 months ago
Can they even untrain an AI with a given set of data?
"Bard, forget this ever existed"
"Sure, I'll make a copy so I won't forget to forget"

flying_wotsit@lemmy.blahaj.zone · 10 months ago
Machine unlearning is a very active field right now, but basically, no, not really

Possibly linux@lemmy.zip · 10 months ago
You laugh, but it would be really dumb for this not to happen. They would be held liable, and most of the data they collect doesn't come from Incognito

snooggums@midwest.social · 10 months ago
Sure they will be held liable. Just like everything else they are held liable for.

Possibly linux@lemmy.zip · 10 months ago
They aren't liable for much

Dupree878@lemmy.world · 10 months ago
That's @snooggums@kbin.social's point

BeatTakeshi@lemmy.world · 10 months ago
Why isn't there a sarcasm flair on lemmy (heck, on the Internet)? /s

Rai@lemmy.dbzer0.com · 10 months ago
Back in my day, we didn't need it! Sarcasm was much more understood on LUE, Gen[M]ay, SA, and DDRFreak!
(I haven't been to these sites in a long time, so please don't crucify me if they're shitholes now. Other than DDRFreak. RIP DDRFreak)

sadreality@kbin.social · 10 months ago
That's the point… so your original statement does not carry much weight.

Scolding0513@sh.itjust.works · 10 months ago
You are way too optimistic, my man, haha