I waddled onto the beach and found a computer to use.

🍁⚕️ 💽

Note: I’m moderating a handful of communities in more of a caretaker role. If you want to take one on, send me a message and I’ll share more info :)

  • 1.05K Posts
  • 3.13K Comments
Joined 1 year ago
Cake day: June 5th, 2023

  • At the same time, the variables in that calculation might change over time. If it becomes easy enough for them to support it, or the costs of not supporting it get too high, they might change their minds.

    Alternatively: wean yourself and your friends off Snapchat. In my part of the world, Snapchat isn’t popular anymore. It doesn’t offer anything new, so barely anyone uses it.

  • This part of the interview felt relevant to the fediverse (note that this was pasted from a transcript, and you might find it easier to watch the video than read the transcript):

    Australia’s safety commissioner recently took on Elon Musk, for example, requesting the removal of vision of a stabbing in a church here in Sydney. It was unsuccessful. Should tech platforms be held responsible for spreading that sort of content?

    Well, I think we need to break that question down and actually question the form that tech platforms have taken, because we live in a world right now where there are about five major social media platforms that are very literally shaping the global information environment for everyone. So we have a context where these for-profit surveillance tech actors have outsized control over our information environment, and present a very, very attractive political target to those who might want to shape, or misshape, that information environment. So I think we need to go to the root of the problem. The issue is not that every regulator doesn’t get a chance to determine appropriate or inappropriate content. The issue is that we have a one-size-fits-all approach to our shared information ecosystem, and that these companies are able to determine what we see or not, via algorithms that are generally calibrated to increase engagement; to promote more hyperbolic or more inflammatory content. We should really be attacking this problem at the root: beginning to grow more local and rigorous journalism outside of these platforms, and ensuring that there are more local alternatives to the one-size-fits-all surveillance platform business model.
