• istanbullu@lemmy.ml · 3 months ago

    I don’t get the AI hate. It reminds me of the hate for digital editing tools like Photoshop 20 years ago.

    • Lime Buzz@beehaw.org · 3 months ago

      It uses too much power.

      They steal from those who actually create work and don’t pay them for the [sarcasm] privilege [/sarcasm]. People have to eat, and as creatives we need to be paid for our work. Otherwise those feeding the models data need to create the work themselves, which they can’t do, because most of them have invested zero time or effort into actually being creative in that way. That’s why they do this: most of them have no actual skill and are envious of those who do.

      I also just generally don’t want my conversations to be used to fuel some bot that makes money for someone else. It’s bad for privacy, and what am I getting out of it? Nothing materially beneficial that I can see, and it’ll probably endanger my life or someone else’s.

      Its results are often bad, as in inaccurate or downright dangerous.

      It’s just the latest in a long list of scams designed to give money and power to those already at the top of the pile; see: cryptocurrency, NFTs, loot boxes, microtransactions, etc.

        • Lime Buzz@beehaw.org · 3 months ago

          When I bring this up, people keep talking about copyright. I’m not really sure why, as I don’t explicitly say anything about copyright. I’m personally against copyright; what I’m for is two things:

          1. Creative people getting paid, under a system where they need money to live and to continue creating art of all types.
          2. Consent.

          The first is a very important point because, if you don’t have to pay an actual artist to create things, then they won’t get paid and thus won’t be able to live. That’s why machine learning is bad under capitalism.

          The second is also very important, but tech obsessives and big companies time and time again don’t care enough about it. Consent should be sought explicitly before any data is used, and time and time again this is not done. Consent should also be revocable at any point, for any reason, without consequences for the revoker.

          That’s what I care about, not copyright in and of itself. I care about people getting paid, at a price they’ve set, for every piece of data that’s fed into the machines, and about explicit, revocable consent being given to train them. That’s it.

          If those two conditions are met, then personally I’d likely have no problem with it.

          Edit: Yes, I agree a lot of the ‘creative industrial complex’ is also bad and should be abolished. I’m also very much against the companies that control artists’ access to mass reproduction and mass distribution of their work.

          The internet has been somewhat of a levelling space in this regard. However, services like Spotify are also bad: they are owned by the big companies, pay the artists little, and get away with it. That’s why I always do my best to pay the artist directly and buy their music; if films and TV shows were stripped of DRM and I could just buy the files directly, I’d do that too.

          Edit 2: I’m very glad for things like Ko-fi and Comradery, which are trying to be good funding sources that aren’t backed by venture capital; that’s wonderful, and more artists deserve to get paid through them. I personally think Patreon is bad and fewer people should use it, because as a venture-capital-backed company it has shown that VCs are its real customers, not the people using it to get paid.

          • Even_Adder@lemmy.dbzer0.com · 3 months ago

            Did you read the first one?

            Making quantitative observations about works is a longstanding, respected and important tool for criticism, analysis, archiving and new acts of creation. Measuring the steady contraction of the vocabulary in successive Agatha Christie novels turns out to offer a fascinating window into her dementia: https://www.theguardian.com/books/2009/apr/03/agatha-christie-alzheimers-research
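            (As a rough illustration of what that kind of quantitative observation involves, here is a minimal sketch in Python; the filenames novel_a.txt and novel_b.txt are hypothetical stand-ins for plain-text copies of two books.)

                import re

                def vocabulary_stats(path):
                    # Count distinct word types and total word tokens in a plain-text file.
                    with open(path, encoding="utf-8") as f:
                        words = re.findall(r"[a-z']+", f.read().lower())
                    return len(set(words)), len(words)

                for title in ["novel_a.txt", "novel_b.txt"]:  # hypothetical filenames
                    types, tokens = vocabulary_stats(title)
                    # Type/token ratio: a crude proxy for vocabulary richness.
                    print(f"{title}: {types} distinct words out of {tokens} total "
                          f"(type/token ratio {types / tokens:.3f})")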

            The final step in training a model is publishing the conclusions of the quantitative analysis of the temporarily copied documents as software code. Code itself is a form of expressive speech – and that expressivity is key to the fight for privacy, because the fact that code is speech limits how governments can censor software: https://www.eff.org/deeplinks/2015/04/remembering-case-established-code-speech/

            What you want would give Disney broad powers to oppressively control large amounts of popular discourse. I acknowledge that specific expressions deserve protection and should retain specific rights, but the rights they don’t have are what have always enabled ethical self-expression and productive dialogue. Wanting to bar others from analyzing your work, to keep them from iterating on your ideas or expressing the same ideas differently, is both selfish and harmful.

            You’re against the type of system you desperately want to become. Using things “without permission” forms the bedrock on which artistic expression and free speech as a whole are built. I don’t think any state is going to pass a law that guts the core freedoms of art, research, and the basic functioning of the internet and computers.

            • Lime Buzz@beehaw.org · 3 months ago

              I’ll admit, it was kind of long and difficult to read for some reason, so I kind of started and then didn’t read everything in it. Maybe I’ll try again later.

              Okay, that’s fair. I don’t want the ‘creative industrial complex’ like Disney etc. to gain more power; sorry if I came off incorrectly. I can see the flaws in my argument now, but machine learning/LLMs still make me angry and upset, because sure, if a person is analysing my work, that’s fine. I just don’t particularly want that work to be used to make new work without the skills necessary to do so well; LLMs/machine learning cannot gain those skills because they are not alive and thus cannot create. I actually release most of my work very permissively, but I still don’t want it to train some model. I’m happy if people are inspired by it or do it better than I can, though.

              The article mentions, though, that using things ‘without permission’ is how a lot of people became and remain(ed) poor, especially people from marginalised communities, likely at the hands of those in power. So again, I think we’re on the same page there?

              • Even_Adder@lemmy.dbzer0.com · 3 months ago

                I just don’t particularly want that work to be used to make new work without the skills necessary to do so well; LLMs/machine learning cannot gain those skills because they are not alive and thus cannot create.

                This kind of sentiment saddens me. People can’t look past the model and see the person who put in their time, passion, and knowledge to make it. You’re begrudging someone who took a different path in life, spent their irreplaceable time acquiring different skills, and applied them to achieve something they wanted. Because of that, they don’t deserve it, as they didn’t do it the same way you did, with the same opportunities and materials.

                The article mentions, though, that using things ‘without permission’ is how a lot of people became and remain(ed) poor, especially people from marginalised communities, likely at the hands of those in power. So again, I think we’re on the same page there?

                We are, but that’s just one symptom of a larger exploitative system, where the powerful can extract while denying opportunities to the oppressed. AI training isn’t only for mega-corporations. We shouldn’t put up barriers that only benefit the ultra-wealthy and hand corporations a monopoly on a public technology by making it prohibitively expensive for regular people. Mega-corporations already own datasets and have the money to buy more. And that’s before their predatory ToS that allow them exclusive access to user data, effectively selling our own data back to us.

                Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off, with less than where we started. We need to make sure this remains a two-way street: corporations have so much to lose, and we have everything to gain. Just look at the floundering cinema industry, weak cable and TV numbers, and print media.

    • Elise@beehaw.org · 3 months ago

      I’m mostly on your side, but I did just cancel my GPT-4 subscription.

      There are several factors, but the main one is that it just uses a whole lot of power and materials even though I don’t really need it.

      For example, it helped me learn about electronics, and it was effective at that. But I feel it’s more efficient to just buy an ebook. It feels slightly less convenient, but it’s actually healthier for my focus.

      It’s kind of like with Bitcoin. It isn’t a net positive given our current situation; it’s a waste of the scarce resources we have. And we need to get to net zero ASAP and stop mining like there’s no tomorrow.

      The other thing it was good at was searching for information and providing it in a uniform format, rather than the mess that is the web right now. But installing Firefox and a bunch of extensions solved most of that. And search engines allow generating an LLM response when I feel it would really help, so that fills the gap.

    • FlorianSimon@sh.itjust.works · 3 months ago

      There are multiple reasons:

      • for plenty of use cases where it’s supposed to help, it just plain doesn’t work (software engineering being my main use case, but it doesn’t help find obscure songs/games either).
      • it’s fundamentally unsafe, which matters for a lot of the use cases the evangelists want to put AI into.
      • the resource usage to train models is crazy, to the point of delaying Google’s carbon neutrality plans, for instance. It’s also expected to take a significant toll on energy grids worldwide in the years to come, which is the last thing we need on a burning planet.
      • it’s being pushed by evil actors like big tech billionaires, who aren’t trusted to do the right things with the tech.
      • it’s already proven harmful (cf. Air Canada’s chatbot, or the idiots in today’s other HN LLM thread saying they use it for medical advice or electrical work, among many examples).
      • it’s overhyped, much like crypto. Way too many promises, and it does not deliver.

      My sentiment about its reliability is shared within my team, among people who have used it a bit more: it’s a garbage machine.

      I do fear it might train a generation of software professionals who don’t know how to code, which is going to harm them (unemployable) or the people they serve, but I might be overreacting because the only person I knew who claimed to use LLMs professionally was a hack who used LLMs as a palliative for a general lack of skill and low output. Come to think of it, those are precisely the kind of people who should be cautious around LLMs, because they can’t accurately review the LLM’s output for dangerous hallucinations.

      I do ask ChatGPT questions sometimes, but honestly pretty rarely. I use it as a complement to regular search, and nothing more, because it can’t get the basics right most of the time.