• schnurrito · 1 month ago

    I’ve recently said this in another thread, and I’ll repeat it here: this problem would easily be solved by changing content-liability laws (e.g. Section 230 in the US) so that anything recommended by an algorithm counts as speech by the platform itself, making the platform liable if that content turns out to be illegal (e.g. libellous).

    That would mean you could operate a forum, a wiki, or a Lemmy or Mastodon instance without worrying about liability, but Facebook, YouTube, and TikTok would have to drop the feature that inserts “things that might interest you” (content you never actually chose to follow) into your feed.

    None of that has anything to do with anyone’s age.

    • silentdon@beehaw.org · 1 month ago

      This could work as long as “algorithm” is sufficiently defined; otherwise someone could argue that even a plain sorting “algorithm” counts.
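
      One way to make the definitional line concrete: a hypothetical Python sketch (the post fields, names, and scoring are all invented for illustration). Both functions reorder a feed, but only the second injects content the user never chose to follow:

      ```python
      from datetime import datetime

      # Hypothetical post records; the fields are invented for illustration.
      posts = [
          {"title": "A", "created": datetime(2025, 1, 3), "followed": True,  "engagement": 0.2},
          {"title": "B", "created": datetime(2025, 1, 1), "followed": True,  "engagement": 0.9},
          {"title": "C", "created": datetime(2025, 1, 2), "followed": False, "engagement": 0.8},
      ]

      def chronological_feed(posts):
          # A plain sorting "algorithm": only followed posts, newest first.
          # The platform adds nothing beyond the order the user asked for.
          followed = [p for p in posts if p["followed"]]
          return sorted(followed, key=lambda p: p["created"], reverse=True)

      def recommended_feed(posts):
          # A recommendation "algorithm": ranks by predicted engagement and
          # injects unfollowed posts ("things that might interest you").
          return sorted(posts, key=lambda p: p["engagement"], reverse=True)

      print([p["title"] for p in chronological_feed(posts)])  # ['A', 'B']
      print([p["title"] for p in recommended_feed(posts)])    # ['B', 'C', 'A']
      ```

      Under the proposal above, a platform running only the first kind of function would keep its liability shield; one running the second would be treating the ranking as its own speech.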

      • schnurrito · 1 month ago

        Agreed. This is a potential problem, but not an unsolvable one.