
Silicon Valley has bet big on generative AI, but it’s not yet clear whether that bet will pay off. A new report from the Wall Street Journal claims that, despite the endless hype around large language models and the automated platforms they power, tech companies are struggling to turn a profit on AI.

Microsoft, which has invested billions in its partner OpenAI to ride the generative AI boom, has been losing money on one of its major AI platforms. GitHub Copilot, which launched in 2021, was designed to automate parts of a coder’s workflow and, while immensely popular with its user base, has been a huge “money loser,” the Journal reports. Users pay a $10-a-month subscription fee for Copilot but, according to a source interviewed by the Journal, Microsoft lost an average of $20 per user during the first few months of this year. The heaviest users cost the company more than $80 per month on average, the source told the paper.
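The arithmetic behind those figures is worth spelling out. A minimal back-of-envelope sketch, assuming the reported losses are net of the subscription fee (the article doesn’t say explicitly):

```python
# Back-of-envelope sketch of the Copilot economics described above. The $10
# subscription and the $20 / $80 loss figures come from the article; the
# implied serving cost per user is an inference, not a reported number.
SUBSCRIPTION = 10   # monthly Copilot price, USD
AVG_LOSS = 20       # average monthly loss per user, per the WSJ's source
HEAVY_LOSS = 80     # reported loss on the heaviest users

# If Microsoft nets -$20 after collecting $10, serving the average user
# must cost roughly $30 a month; the heaviest users, roughly $90.
avg_cost = SUBSCRIPTION + AVG_LOSS
heavy_cost = SUBSCRIPTION + HEAVY_LOSS
print(avg_cost, heavy_cost)  # 30 90
```

In other words, compute costs for the average user would run to about three times the revenue that user brings in.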

OpenAI’s ChatGPT, meanwhile, has seen a declining user base while its operating costs remain incredibly high. A report from the Washington Post in June claimed that chatbots like ChatGPT lose money pretty much every time a customer uses them.

AI platforms are notoriously expensive to operate. Platforms like ChatGPT and DALL-E burn through an enormous amount of computing power, and companies are struggling to figure out how to reduce that footprint. At the same time, the infrastructure needed to run AI systems, such as high-end AI chips, is expensive in its own right. The cloud capacity necessary to train algorithms and run AI systems, meanwhile, is also expanding at a frightening rate. All of this energy consumption also means that AI is about as environmentally unfriendly as you can get.

  • abhibeckert@beehaw.org · 9 months ago

    GitHub Copilot is extremely useful. It also runs on pretty much every keystroke, and programmers make a lot of keystrokes throughout the day…

    It’s a useful enough tool that people would be willing to pay more, but at the same time it’s not using an advanced AI model. It uses an older (and now deprecated) model, and I’m pretty sure a high-end computer (even some laptops) could produce similar output without any cloud service, using open-source / freely available models.

    My feeling is Copilot needs to either lower its price or improve the quality of the product if it’s going to survive. And I suspect they’re going to do the latter.

    The other factor not discussed here is that the hardware we use today for this task isn’t really designed for it. The GPUs most datacentres run AI models on are designed for graphics, not AI, and the algorithms mostly just need huge amounts of fast memory. I’m sure there will soon be dedicated hardware designed specifically for large language models, with fewer compute cores and more memory. They’ll likely also run at lower clock speeds and use less power / generate less heat / etc.
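    The “huge amounts of fast memory” point can be sketched with a back-of-envelope calculation: in autoregressive generation, each token requires streaming roughly all of the model’s weights through the memory bus, so bandwidth, not compute, tends to set the ceiling. All figures below are hypothetical round numbers, not measurements of any real chip or model:

    ```python
    # Illustrative sketch of why LLM inference is memory-bandwidth-bound.
    # Every number here is a hypothetical round figure for illustration.
    def max_tokens_per_second(model_bytes: float, bandwidth_bytes_per_s: float) -> float:
        # Each generated token streams (roughly) all model weights from
        # memory once, so bandwidth divided by model size bounds throughput.
        return bandwidth_bytes_per_s / model_bytes

    # A hypothetical 13B-parameter model at 2 bytes per weight = 26 GB.
    model = 13e9 * 2
    # A hypothetical accelerator with 1 TB/s of memory bandwidth.
    gpu_bw = 1e12

    print(round(max_tokens_per_second(model, gpu_bw), 1))  # prints 38.5
    ```

    Under these assumptions a single user saturates tens of tokens per second of capacity, which is why memory size and bandwidth, rather than raw compute, dominate the hardware bill.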

    Just because companies are losing money right now doesn’t mean they will be in five years’ time.

    • upstream@beehaw.org · 9 months ago

      I’m sure Copilot will be revamped with the newer GPT models; they’re just not prioritizing it right now.