• V0ldek@awful.systems · 1 month ago

    That’s the fucking problem: it’s impossible to tell, since MSFT won’t tell you directly, and only the people who run the datacenters could.

    The only relatively reliable numbers I was able to find were in this research paper by Luccioni and Strubell from the ACM Conference on Fairness, Accountability, and Transparency (FAccT) 2024. Now, that’s an obscure conference (not even ranked by CORE), but Dr. Luccioni appears to be right on the money about the dangers of AI (https://www.sashaluccioni.com/).

    • skillissuer · 1 month ago

      they will tell you the total tho: https://www.latitudemedia.com/news/microsoft-reveals-the-energy-impact-of-artificial-intelligence

      this works out to 2.7GW in 2023, on average. that’s comparable to peak daily consumption in croatia (today). if that 30%-ish figure is accurate, then something closer to 700MW is ai-only, which is a whole smaller country, like macedonia (rough math below)
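
      back of the envelope, as a sketch rather than official numbers: this assumes the article’s annual figure is ~24 TWh for 2023 (which is what a 2.7GW average works back to) and takes the ai share from the 36% growth announcement mentioned further down.

      ```python
      # convert an annual energy figure (TWh) to average power draw (GW)
      HOURS_PER_YEAR = 8766  # 365.25 days * 24 h

      annual_twh = 24.0  # assumed MS 2023 total consumption, TWh
      avg_gw = annual_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then / hours
      print(f"average draw: {avg_gw:.2f} GW")  # ~2.74 GW

      # if the entire 36% year-over-year growth is ai, then ai's share
      # of the *new* total is 0.36 / 1.36, not 36%
      ai_share = 0.36 / 1.36  # ~26.5%
      print(f"ai-only: {avg_gw * ai_share * 1000:.0f} MW")  # ~725 MW
      ```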

      which only highlights how bizarre their 5GW proposition is. hey, let’s outbuild ms 2x, like, now

      • skillissuer · 1 month ago (edited)

        that sounds like it’s much less than crypto at its peak, and even the 2023 estimate differs by over an order of magnitude (14.5GW avg). there’s also google and fb and whoever else (aws?)
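
        (quick sanity check on that gap, plugging in the numbers from this thread: 14.5GW avg for crypto’s 2023 estimate vs ~0.7GW for ms ai-only)

        ```python
        # "over an order of magnitude" = ratio above 10x
        crypto_gw, ms_ai_gw = 14.5, 0.7
        print(f"{crypto_gw / ms_ai_gw:.0f}x")  # ~21x, so comfortably over
        ```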

    • skillissuer · 1 month ago

      i started to look up satellite photos and openinframap to figure out the maximum capacity of their substations, but the powerlines feeding them are probably massively oversized, and the substations are probably oversized too, to make them redundant and high-availability. so there might be some way to guess it, but some of the lines will be underground, and if they’re doing load-following to match their renewables (which might be cheaper for them) then it’s all oversized a bit on top of that
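
      if anyone wants to poke at this, a rough sketch: openinframap renders OpenStreetMap power data, so the same substations can be pulled via the Overpass API. coordinates, radius, and tags here are placeholders, not a real methodology.

      ```python
      # list OSM-mapped substations within 5 km of a (hypothetical) datacenter site
      import requests

      OVERPASS = "https://overpass-api.de/api/interpreter"
      lat, lon = 47.23, -119.85  # placeholder coordinates

      query = f"""
      [out:json];
      (
        node["power"="substation"](around:5000,{lat},{lon});
        way["power"="substation"](around:5000,{lat},{lon});
      );
      out tags center;
      """

      resp = requests.post(OVERPASS, data={"data": query})
      for el in resp.json().get("elements", []):
          tags = el.get("tags", {})
          # the voltage tag (where mapped) hints at transmission capacity,
          # but as noted above these are sized for redundancy, not typical load
          print(tags.get("name", "unnamed"), tags.get("voltage", "?"))
      ```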

      • V0ldek@awful.systems · 1 month ago

        Well, the main problem is that a datacenter runs much more than just AI. You’d need to somehow subtract the “normal” cloud usage to isolate just the promptfondling.

        • skillissuer · 1 month ago (edited)

          ez. remember that announcement when ms said their energy use went up 36%? that’s ai, and it includes both training and use. a 36% rise works out to ai being roughly 0.36/1.36 ≈ 26% of the new total

          this can still be fudged with more efficient office heating, shutdowns of the least efficient dcs, and so on, but only to a limited degree