Emotional_Series7814

  • 11 Posts
  • 62 Comments
Joined 1 year ago
Cake day: June 25th, 2023

  • I’m still new enough I can’t make needle recommendations, but my first project was a garter stitch scarf. 10 stitches on size 13 needles… yeah… that wasn’t much of a scarf.

    When my mom tried to teach me to cast on for knitting, I went cross-eyed.

    I had a similar experience to yours learning how to crochet. Right now all I can do is a chain stitch. And nothing else. No adding any height to the chain, just making a single long chain.

  • I don’t consciously make these calculations either, but what you just described sounds exactly like how I choose what to click on. Also came here for suggestions!

    I’ll say that I’ve looked up hobbies I enjoy but don’t think about much so I can boost my engagement on the Fediverse. Normally I wouldn’t bother, but I want to help this place grow, so I’ve let in things I have a milder interest in alongside my usual interests. This is also how I get variety in the posts I see, since I usually stick to /sub. When I wander out, it’s on purpose and to a specific known community, because /all usually has some depressing political news or ragebait that would get me to outrage-click. I’m here to have a good time, not to doomscroll or get angry. Kbin has no algorithm designed to keep us scrolling, but those things do generate the most engagement, so it’s only natural they show up on /all often enough (though not as often as they would on Reddit’s popular page) that I’d rather avoid it.

  • “We believe that users should have a say in how their attention is directed, and developers should be free to experiment with new ways of presenting information,” Bluesky’s chief executive, Jay Graber, told me in an email message.

    Of course, there are also challenges to algorithmic choice. When the Stanford political science professor Francis Fukuyama led a working group that in 2020 proposed that outside entities offer algorithmic choice, critics raised many concerns.

    Robert Faris and Joan Donovan, then of Harvard’s Shorenstein Center, wrote that they were worried that Fukuyama’s proposal could let platforms off the hook for their failures to remove harmful content. Nathalie Maréchal, Ramesh Srinivasan and Dipayan Ghosh argued that his approach would do nothing to change the underlying business model of some tech platforms, which incentivizes the creation of toxic and manipulative content.

    Mr. Fukuyama agreed that his solution might not help reduce toxic content and polarization. “I deplore the toxicity of political discourse in the United States and other democracies today, but I am not willing to try solving the problem by discarding the right to free expression,” he wrote in response to the critics.

    When she ran the ethics team at Twitter, Rumman Chowdhury developed prototypes for offering users algorithmic choice. But her research revealed that many users found it difficult to envision having control of their feed. “The paradigm of social media that we have is not one in which people understand having agency,” said Ms. Chowdhury, whose Twitter team was let go when Mr. Musk took over. She went on to found the nonprofit Humane Intelligence.

    But just because people don’t know they want it doesn’t mean that algorithmic choice is not important. I didn’t know I wanted an iPhone until I saw one.

    And with another national election looming and disinformation circulating wildly, I believe that asking people to choose disinformation, rather than accept it passively, would make a difference. If users had to pick an antivaccine news feed and could see that there are other feeds to choose from, the existence of that choice would itself be educational.

    Algorithms make our choices invisible. Making those choices visible is an important step in building a healthy information ecosystem.