• Even_Adder@lemmy.dbzer0.com · ↑1 · 3 months ago

    Did you read the first one?

    Making quantitative observations about works is a longstanding, respected and important tool for criticism, analysis, archiving and new acts of creation. Measuring the steady contraction of the vocabulary in successive Agatha Christie novels turns out to offer a fascinating window into her dementia: https://www.theguardian.com/books/2009/apr/03/agatha-christie-alzheimers-research

    The final step in training a model is publishing the conclusions of the quantitative analysis of the temporarily copied documents as software code. Code itself is a form of expressive speech – and that expressivity is key to the fight for privacy, because the fact that code is speech limits how governments can censor software: https://www.eff.org/deeplinks/2015/04/remembering-case-established-code-speech/

    What you want would give Disney broad powers to oppressively control large amounts of popular discourse. I acknowledge that specific expressions deserve protection and retain specific rights, but the rights they don’t cover are what have always enabled ethical self-expression and productive dialogue. Wanting to bar others from analyzing your work, to keep them from iterating on your ideas or expressing the same ideas differently, is both selfish and harmful.

    You want to become the very type of system you’re against. Using things “without permission” forms the bedrock on which artistic expression and free speech as a whole are built. I don’t think any state is going to pass a law that guts the core freedoms of art and research and the basic functionality of the internet and computers.

  • Lime Buzz@beehaw.org · ↑1 · edited · 3 months ago

      I’ll admit, it was kind of long and difficult to read for some reason, so I started it but didn’t read all of it. Maybe I’ll try again later.

      Okay, that’s fair. I don’t want the ‘creative industrial complex’ like Disney etc. to gain more power, so sorry if I came off incorrectly. I can see the flaws in my argument now. But machine learning/LLMs still make me angry and upset, because if a person is analysing my work, that’s fine; I just don’t particularly want that work used to make new work without the skills necessary to do so well. LLMs/machine learning cannot gain those skills, because they are not alive and thus cannot create. I actually release most of my work very permissively, but I still don’t want it training some model. I’m happy if people are inspired by it or do it better than I can, though.

      The article mentions, though, that using things ‘without permission’ is how a lot of people became and remain(ed) poor, especially people from marginalised communities, likely taken by those in power. So again, I think we’re on the same page there?

    • Even_Adder@lemmy.dbzer0.com · ↑2 · 3 months ago

        I just don’t particularly want that work used to make new work without the skills necessary to do so well. LLMs/machine learning cannot gain those skills, because they are not alive and thus cannot create.

        This kind of sentiment saddens me. People can’t look past the model and see the person who put in their time, passion, and knowledge to make it. You’re begrudging someone who took a different path in life, spent their irreplaceable time acquiring different skills, and applied them to achieve something they wanted. In your view, they don’t deserve it because they didn’t do it the same way you did, with the same opportunities and materials.

        The article mentions, though, that using things ‘without permission’ is how a lot of people became and remain(ed) poor, especially people from marginalised communities, likely taken by those in power. So again, I think we’re on the same page there?

        We are, but that’s just one symptom of a larger exploitative system in which the powerful extract while denying opportunities to the oppressed. AI training isn’t only for mega-corporations. We shouldn’t put up barriers that only benefit the ultra-wealthy and hand corporations a monopoly on a public technology by making it prohibitively expensive for regular people. Mega-corporations already own datasets and have the money to buy more. And that’s before counting the predatory ToS that grant them exclusive access to user data, effectively selling our own data back to us.

        Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off than where we started. We need to make sure this remains a two-way street: corporations have so much to lose, and we have everything to gain. Just look at the floundering cinema industry, weak cable and TV numbers, and shrinking print media.