• stevedidwhat_infosec@infosec.pub · +14/-4 · 9 months ago

      I mean literally this. AI is a strong tool when used properly, but we should obviously not lose sight of long-term goals in favor of short-term opportunities.

      • BirdEnjoyer@kbin.social · +9/-3 · 9 months ago

        Yep.

        This was never the fault of AI.
        It’s always the corporations: by nature, they’re designed to chase whatever gets the most profit, at any cost.
        Just look at what they do in the meat industry, both to the living creatures and to the employees who have to deal with it.

        I don’t think people appreciate just how dangerously willing these current AI companies are to set fire to the upcoming few decades of society- all for a little bit of glory right now.

        The energy problem is definitely one thing I didn’t realize was quite so dire, but we’re also on the cusp of totally losing control over our own likenesses, and these companies really couldn’t care less.

    • wise_pancake@lemmy.ca · +4/-5 · 9 months ago

      AI has replaced so much of what I used to Google search for, but without the blog fluff (though it adds its own flavour of fluff).

      All of it is low stakes, so I’m not worried about the accuracy as long as it keeps me moving on a task.

      • insomniac_lemon@kbin.social · +1 · edited · 9 months ago

        I tried this recently in hopes of finding an animation pilot, but it was too willing to give me completely wrong answers (the most popular things, or even kids’ shows), or it would just make a name up. Admittedly, I was using 13b.Q4 models, and they are not the newest ones.

        I ended up finding what I was looking for by pure coincidence: I did a generic search for Adult Swim pilots (I had already combed the Wikipedia page and their site), and one of the higher results was a Reddit thread where someone was looking for the same show I was and had made the same mistake I did (mistaking a Cartoon Hangover short for an Adult Swim pilot).

        After that I tried finding an even older and dumber animation that I had gotten on the PSN during the PS3 era, those terms tripped the AI up because it would only give me videogames.

        (Certain things are probably better suited to asking an AI. I’m not sure the computation is worth it, but then again search is pretty garbage these days unless it’s an obvious query that won’t get mixed up with other newer or more popular terms.)

  • Zaktor@sopuli.xyz · +22/-3 · edited · 9 months ago

    de Vries, who now works for the Netherlands’ central bank, estimated that if Google were to integrate generative A.I. into every search, its electricity use would rise to something like twenty-nine billion kilowatt-hours per year. This is more than is consumed by many countries, including Kenya, Guatemala, and Croatia.

    Why on earth would they do that? Just cache the common questions.

    It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

    OK, so the actual real-world estimate is somewhere on the order of a million kilowatt-hours per day, for the entire globe. Even if we assume that’s just the US, there are 125M households, so that’s 4 watt-hours per household per day. An LED light bulb consumes 8 watts. Turn one of those off for half an hour and you’ve balanced out one household’s worth of ChatGPT energy use.

    This feels very much like the “turn off your lights to do your part for climate change” distraction from industry and air travel. They’ve mixed and matched units in their comparisons to make it seem like this is a massive amount of electricity, but it’s basically irrelevant. Even the big AI-in-every-search number only works out to 0.6 kWh/day (again, if all search were done only by Americans), which isn’t great, but is still on the order of “don’t spend hours watching a big-screen TV or playing on a gaming computer”, and compares to the 29 kWh a household already uses.

    Math, because this result is so irrelevant it feels like I’ve done something wrong:

    • 500,000 kWh/day / 125,000,000 US households = 0.004 kWh/household/day
    • 29,000,000,000 kWh/yr / 365 days/yr / 125,000,000 households = 0.6 kWh/household/day, compared to the 29 kWh baseline
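
    The same arithmetic as a quick script, in case anyone wants to rerun it (a minimal sketch; the only inputs are the article’s figures and the US household count):

    ```python
    # Back-of-envelope check of the per-household numbers above.
    US_HOUSEHOLDS = 125_000_000

    chatgpt_kwh_per_day = 500_000                # "more than half a million kilowatt-hours"
    print(chatgpt_kwh_per_day / US_HOUSEHOLDS)   # 0.004 kWh, i.e. 4 Wh per household per day

    ai_search_kwh_per_year = 29_000_000_000      # de Vries' AI-in-every-search estimate
    print(ai_search_kwh_per_year / 365 / US_HOUSEHOLDS)  # ~0.64 kWh per household per day, vs. a 29 kWh baseline
    ```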
    • kibiz0r@midwest.social · +2 · 9 months ago

      Just cache the common questions.

      There are only two hard things in Computer Science: cache invalidation and naming things.

      • boonhet@lemm.ee · +4 · 9 months ago

        You mean: two hard things - cache invalidation, naming things and off-by-one errors

        • kibiz0r@midwest.social · +3 · 9 months ago

          Reminds me of the two hard things in distributed systems:

          • 2: Exactly-once delivery
          • 1: Guaranteed order
          • 2: Exactly-once delivery
      • Zaktor@sopuli.xyz · +3 · 9 months ago

        It’s a good thing, then, that Google has a massive pre-existing business built around caching and updating search responses. The naming-things side of their business could probably use some more work, though.

    • frezik@midwest.social · +3/-2 · 9 months ago

      Just cache the common questions.

      AI models work in a feedback loop. The fact that you’re asking the question becomes part of the response next time. They could cache it, but the model is worse off for it.

      Also, they are Google/Microsoft/OpenAI. They will do it because they can and nobody is stopping them.

      • Zaktor@sopuli.xyz · +1 · 9 months ago

        This is AI for search, not AI as a chatbot. And in the search context many requests are functionally similar and can have the same response. You can extract a theme to create contextual breadcrumbs that will be effectively the same as other people doing similar things. People looking for Thai food in Los Angeles will generally follow similar patterns and need similar responses, even if it comes in the form of several successive searches framed as sentences with different word ordering and choices.

        And none of this is updating the model (at least not in a real-time sense that would require re-running a cached search), it’s all short-term context fed in as additional inputs.
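
        A minimal sketch of the kind of caching I mean, keying near-duplicate searches to a single generated answer (the normalize/AnswerCache names, the stop-word list, and the one-day TTL are illustrative assumptions, not anything Google actually does):

        ```python
        import hashlib
        import re
        import time

        STOP_WORDS = {"the", "a", "an", "in", "for", "best", "near", "me", "good"}

        def normalize(query: str) -> str:
            """Map superficially different queries to the same cache key."""
            words = [w for w in re.findall(r"[a-z0-9]+", query.lower()) if w not in STOP_WORDS]
            return hashlib.sha256(" ".join(sorted(words)).encode()).hexdigest()

        class AnswerCache:
            def __init__(self, ttl_seconds: int = 86_400):
                self.ttl = ttl_seconds
                self.store: dict[str, tuple[float, str]] = {}

            def get(self, query: str) -> str | None:
                hit = self.store.get(normalize(query))
                if hit and time.time() - hit[0] < self.ttl:
                    return hit[1]    # reuse the answer, skip the expensive generation
                return None

            def put(self, query: str, answer: str) -> None:
                self.store[normalize(query)] = (time.time(), answer)

        # "thai food los angeles" and "best Thai food in Los Angeles" normalize to the
        # same key, so the second search can be served from cache instead of re-generated.
        ```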

  • mindbleach@sh.itjust.works · +20/-2 · 9 months ago

    ‘How dare technology keep doing stuff?’ is a deeply weird criticism.

    This isn’t like crypto bullshit, where finance-bro jackasses did databases in the least efficient possible way. We’re pushing the boundaries of results-driven artificial intelligence, modeled on how biological brains work. Is it miraculous? Not exactly. But it’s answering a lot of questions that were exciting forty-odd years ago and suddenly exploded into relevance due to parallel computing… intended for video games.

    Bemoaning the last year-ish of outright witchcraft, based on the up-front costs of training models that will run on a phone, is a perspective that seems more performative than plausible.

    • stabby_cicada@slrpnk.net · +8/-1 · edited · 9 months ago

      I deeply dislike the line of argument that goes “we shouldn’t bother reducing our personal energy consumption because 100 corporations produce 70% of greenhouse gases” or similar arguments. Of course we should. Because it’s the right thing to do.

      But it’s also true: those 100 corporations and their ilk absolutely promote a false narrative that personal responsibility is the solution to climate change, in order to prevent climate regulation that might harm their bottom line.

      And frankly, I think that’s what’s going on here with panic over AI power consumption. Corporate lobbyists and PR creating yet another distraction to slow the course of climate regulation and guilting ordinary people for doing ordinary things in the process.

      • skuzz · +2 · 9 months ago

        Personal responsibility has always been capitalism’s mechanism for normalizing corpo behavior: the fake Native American anti-litter commercial in the ’70s, banning home cleaners that businesses can still use at industrial scale, buying a new electric car every five minutes being somehow carbon-better than just not being a vehicle consumer. There are examples going even further back in time, but my brain doesn’t currently have enough caffeine to dig them up.

      • mindbleach@sh.itjust.works · +1 · 9 months ago

        Considering one crosspost for this is the sneer-club hypocrites at awful.systems, there’s also the interplay of bad-faith criticism and bad-faith excuses, for their own sake. Individual randos have picked an allegiance and will now engage in kneejerk loyalist ad-hoc justification, because they think that’s how things work. Going from arguments to conclusions would be ridiculous, to them. Their claims are not intended to be evaluated.

        Myriad douchebags have jumped from crypto to AI as the next buzzword cult that might make them hideously rich. People rightly condemning them also tend to jab at whatever bullshit they’re pushing, now. Some of that’s going to be legitimate perspective on an over-hyped autocomplete, powered by copying every book in the library, using racks of video cards running full-tilt 24/7. Some of that’s going to be inane performative mockery of creative tools that border on magical and should scale down to translate speech right in your earbuds. All I can guarantee for you is that the aforementioned douchebags will not know the difference. Neither will their loudest critics. They’ll both say ‘you just don’t understand!’ because they’re just shuffling cards and this deck is not deep.

  • Immersive_Matthew@sh.itjust.works · +16/-4 · 9 months ago

    The issue is how the electricity is generated, not that it is needed in the first place. This is such a great distraction from the real issue that it has got to be big oil spinning the story this way.

    Let’s all hate on AI and crypto because they are ruining the entire environment and if we just stopped them, all would be fine with the planet again /s.

  • BlanketsWithSmallpox@lemmy.world · +9/-1 · edited · 9 months ago

    Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

    So 500 MWh a day across the globe? And this is all just data-center use? That’s not even a tenth of the draw of the newest and largest data center… out of ~11,000 total data centers.

    Existing markets are already struggling to meet demand, the report says. In Northern Virginia, the largest data center market in the world at 3,400MW, availability is running at just 0.2 percent.

    https://www.datacenterdynamics.com/en/news/us-data-center-power-consumption/

    So a drop in the bucket for a crazy useful tool using mostly existing infrastructure…

    The finding that global data centers likely consumed around 205 terawatt-hours (TWh) in 2018, or 1 percent of global electricity use, lies in stark contrast to earlier extrapolation-based estimates that showed rapidly-rising data center energy use over the past decade (Figure 2).

    https://energyinnovation.org/2020/03/17/how-much-energy-do-data-centers-really-use/

    The typical cost of building a solar power plant is between $0.89 and $1.01 per watt. A 1MW (megawatt) solar farm can cost you between $890,000 and $1.01 million… According to GTM Research, 1 MW solar farms require 6–8 acres to accommodate all the necessary infrastructure and space between panel rows.

    https://coldwellsolar.com/commercial-solar-blog/how-much-investment-do-you-need-for-a-solar-farm/

    $300 million and ~2 square miles (7 for reference) to power the entire world’s AI use feels like a non-issue to me. A billionaire could literally fund the entire world’s daily consumption and not dent their holdings…
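
    A rough sanity check, with the assumptions spelled out (the ~4.5 peak-sun-hours figure is a ballpark assumption; the cost and acreage come from the quote above):

    ```python
    # Hypothetical sizing of a solar farm to cover the quoted ChatGPT load.
    DEMAND_MWH_PER_DAY = 500      # "more than half a million kilowatt-hours" per day, rounded
    PEAK_SUN_HOURS = 4.5          # assumed site average; varies a lot by location
    COST_PER_WATT = 1.0           # midpoint of the quoted $0.89-$1.01 range
    ACRES_PER_MW = 7              # midpoint of the quoted 6-8 acres per MW

    capacity_mw = DEMAND_MWH_PER_DAY / PEAK_SUN_HOURS    # ~110 MW of panels
    cost_usd = capacity_mw * 1_000_000 * COST_PER_WATT   # ~$110 million
    square_miles = capacity_mw * ACRES_PER_MW / 640      # ~1.2 square miles
    print(round(capacity_mw), round(cost_usd / 1e6), round(square_miles, 1))
    ```

    Even after padding that for storage and cloudy days, it stays in the same “rounding error for a billionaire” territory.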

    Computers use power… More news at 11.

  • stabby_cicada@slrpnk.net · +10/-2 · 9 months ago

    So I did a little math.

    This site says a single ChatGPT query consumes 0.00396 kWh.

    Assume an average LED light bulb is 10 watts, or 0.01 kWh per hour. If I did the math right (no guarantees there), a single ChatGPT query is roughly equivalent to leaving a light bulb on for about 24 minutes.

    So if you assume the average light bulb in your house is on for three or four hours a day, making 10 ChatGPT queries per day is roughly the equivalent of adding one more light bulb to your house.

    Which is definitely not nothing. But isn’t the end of the world either.
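
    The same math as a quick script (per-query figure from the link, 10 W bulb):

    ```python
    KWH_PER_QUERY = 0.00396       # per-query estimate from the linked site
    BULB_KW = 0.010               # 10 W LED bulb

    print(KWH_PER_QUERY / BULB_KW * 60)   # ~24 minutes of bulb time per query
    print(10 * KWH_PER_QUERY / BULB_KW)   # 10 queries/day ~= 4 bulb-hours/day, i.e. roughly one extra bulb
    ```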

    • AliasAKA@lemmy.world · +5 · edited · 9 months ago

      There’s also the energy required to train the model. Inference is usually more efficient (sometimes not, but almost always significantly so), because there’s no error backpropagation or other training-specific computation.

      Models probably take on the order of 1,000 MWh of energy to train (GPT-3 took 284 MWh by OpenAI’s calculation). That’s not including the web scraping, data cleaning, and other associated costs (such as cooling the server farms, which is non-trivial).

      A coal plant takes roughly 364-500 kg of coal to generate 1 MWh. So for GPT-3 you’d be looking at 103,376 kg of coal (~230 thousand pounds, or about 115 US tons) at minimum just to train it, before anyone has even used it and without counting the other associated energy costs. For comparison, a typical home may use 6 MWh per year, so the electricity that went into training GPT-3 alone could have powered 47 homes for an entire year.
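
      Recomputing those figures with the same inputs:

      ```python
      TRAIN_MWH = 284               # GPT-3 training estimate cited above
      COAL_KG_PER_MWH = 364         # low end of the quoted range
      HOME_MWH_PER_YEAR = 6

      print(TRAIN_MWH * COAL_KG_PER_MWH)     # 103,376 kg of coal, matching the figure above
      print(TRAIN_MWH / HOME_MWH_PER_YEAR)   # ~47 home-years of electricity
      ```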

      Edit: also, it’s not nearly as bad as crypto mining. And as another person says it’s totally moot if we have clean sources of energy to fill the need and the grid can handle it. Unfortunately we have neither right now.

      • sushibowl@feddit.nl · +2 · 9 months ago

        If you amortize training costs over all inference uses, I don’t think 1,000 MWh is too crazy. For a model like GPT-3 there are likely millions of inference calls to split that cost between.

        • AliasAKA@lemmy.world · +1 · 9 months ago

          Sure, and I think these may even be useful enough to warrant the cost. My point is just that this still isn’t simply running a couple of light bulbs; it’s a major draw on the grid (though it likely still pales in comparison to crypto farms).

          Note that most people would be better off using a model trained for a specific task. Training an image-recognition model, for example, takes vastly less energy because the model is vastly smaller, yet it’s still excellent at that one job.

          • Zaktor@sopuli.xyz · +1 · 9 months ago

            The article claims 200M ChatGPT requests per day. Assuming they make a new version yearly, that’s 73B requests per training run. Spreading 1,000 MWh across 73B requests yields an amortized cost of roughly 0.01 watt-hours per request. It’s nothing.

            47 more households’ worth of electricity just isn’t a major draw on anything. We add ~500,000 households a year from natural growth.
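
            The amortization, spelled out:

            ```python
            TRAIN_MWH = 1_000                       # generous rounding of the training cost above
            REQUESTS_PER_YEAR = 200_000_000 * 365   # ~73 billion requests between yearly retrainings

            print(TRAIN_MWH * 1_000_000 / REQUESTS_PER_YEAR)   # ~0.014 Wh per request
            ```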

    • Zink@programming.dev · +2 · 9 months ago

      I have a feeling it’s not going to be the ordinary individual user that’s going to drive the usage to problematic levels.

      If a company can make money off of it, consuming a ridiculous amount of energy to do it is just another cost on the P & L.

      (Assuming of course that the company using it either pays the electric bill, or pays a marked-up fee to some AI/cloud provider)

  • whoelectroplateuntil@sh.itjust.works · +1 · edited · 9 months ago

    Counterpoint: I have a $5 RISC-V computer I can power off USB that has a dedicated AI chip capable of running basic image recognition, voice synthesis, time-series prediction, or whatever other algorithm you want. This is what people need to think about when they think about the future of AI. All this shit with piping things into datacenters to throw away energy talking to the stochastic parrot is (a) not even remotely going to be the main use for AI, and (b) not going to be necessary for tons of tasks (it likely won’t be necessary for ChatGPT-quality interactions within a couple of years). ChatGPT is like the ENIAC of AI. Complaining about the energy use of AI by pointing to the energy use behind ChatGPT queries is the same kind of mistake as when Thomas Watson said in 1943 that there was, maybe, a world market for five computers. Yeah, there was maybe a world market for five ENIACs, tops, but get with the picture, people. AI is much bigger than LLMs.

  • Daxtron2@startrek.website · +3/-3 · 9 months ago

    The bigger companies instead focus on huge, ever-increasing model sizes. Lots of advances are being made with smaller, more affordable models that can run on consumer devices, but the big companies don’t focus on that because it can’t generate as much profit.

    • Sonori@beehaw.org · +4 · 9 months ago

      The problem is that all of the current discussion and hype is about ChatGPT and similar whole-internet models. More specialized, smaller models are often more useful, but they’re also not as easy to hype.

  • sacredmelon@slrpnk.net · +5/-5 · 9 months ago

    This is concerning. Why don’t they just stop the never-ending updates and stick with the latest things we have for a moment? Isn’t all the tech stuff we already have sufficient for the world to keep going?