Archive link

Silicon Valley has bet big on generative AI but it’s not totally clear whether that bet will pay off. A new report from the Wall Street Journal claims that, despite the endless hype around large language models and the automated platforms they power, tech companies are struggling to turn a profit when it comes to AI.

Microsoft, which has poured billions into its partner OpenAI to capitalize on the generative AI boom, has been losing money on one of its major AI products. GitHub Copilot, which launched in 2021 and was designed to automate parts of a coder’s workflow, is immensely popular with its user base but has been a huge “money loser,” the Journal reports. The problem is that users pay a $10-a-month subscription fee for Copilot but, according to a source interviewed by the Journal, Microsoft lost an average of $20 per user during the first few months of this year, and some users cost the company more than $80 per month, the source told the paper.

OpenAI’s ChatGPT, meanwhile, has seen a steadily declining user base while its operating costs remain incredibly high. A report from the Washington Post in June claimed that chatbots like ChatGPT lose money pretty much every time a customer uses them.

AI platforms are notoriously expensive to operate. Systems like ChatGPT and DALL-E burn through enormous amounts of computing power, and companies are struggling to figure out how to reduce that footprint. The infrastructure needed to run AI systems, like powerful, high-priced AI chips, is itself costly, and the cloud capacity necessary to train and run these models is expanding at a frightening rate. All of that energy consumption also means that AI is about as environmentally unfriendly as you can get.

  • ryan@the.coolest.zone · 13 points · 1 year ago

    AI is absolutely taking off. LLMs are taking over various components of frontline support (service desks, tier 1 support). They’re integrated into various systems using LangChain-style retrieval pipelines to pull your data, knowledge articles, etc., and then respond to you based on that data.
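
    A minimal sketch of that retrieval pattern, assuming the classic LangChain retrieval-QA APIs; the knowledge articles, embeddings, and model here are placeholders, not any particular vendor’s setup:

    ```python
    # Toy frontline-support bot grounded in internal knowledge articles,
    # using the classic LangChain retrieval-QA pattern (placeholder data).
    from langchain.embeddings import OpenAIEmbeddings
    from langchain.vectorstores import FAISS
    from langchain.llms import OpenAI
    from langchain.chains import RetrievalQA

    # Stand-ins for a real knowledge-base export.
    articles = [
        "Visitor badges are issued at the front desk of each office.",
        "Visitor cubes can be reserved through the facilities portal.",
    ]

    # Embed and index the articles so relevant ones can be retrieved per question.
    index = FAISS.from_texts(articles, OpenAIEmbeddings())

    # Retrieve the most relevant articles, then have the LLM answer from them.
    support_bot = RetrievalQA.from_chain_type(
        llm=OpenAI(temperature=0),
        retriever=index.as_retriever(),
    )

    print(support_bot.run("Can you book a visitor cube for me?"))
    ```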

    AI is primarily a replacement for workers, like how McDonald’s self-service ordering kiosks are a replacement for cashiers. Cheaper and more scalable, cutting out more and more entry-level (and outsourced) work. But unlike the kiosks, you won’t even see that the “Amazon tech support” you were kicked over to is an LLM instead of a person. You won’t hear that the frontline support tech you called for a product is actually an AI and a text-to-speech model.

    There were jokes about the whole Wendy’s drive thru workers being replaced by AI, but I’ve seen this stuff used live. I’ve seen how flawlessly they’ve tuned the AI to respond to someone who makes a mistake while speaking and corrects themself (“I’m going to the Sacramento office – sorry, no, the Folsom office”) or bundles various requests together (“oh while you’re getting me a visitor badge can you also book a visitor cube for me?”). I’ve even seen crazy stuff like “I’m supposed to meet with Mary while I’m there, can you give me her phone number?” and the LLM routes through the phone directory, pulls up the most likely Marys given the caller’s department and the location the user is visiting via prior context, and asks for more information - “I see two Marys here, Mary X who works in Department A and Mary Y who works in Department B, are you talking about either of them?”
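
    A rough sketch of how that directory routing might work as a tool an LLM agent calls; the directory, names, and filtering rules here are made up, but the idea is the same: narrow by context the model already has (the site being visited), then ask a clarifying question if several matches remain.

    ```python
    # Toy directory-lookup tool an LLM agent could call (names and data made up):
    # filter by context, then fall back to a clarifying question when ambiguous.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Person:
        name: str
        department: str
        site: str
        phone: str

    DIRECTORY = [
        Person("Mary X", "Department A", "Folsom", "555-0101"),
        Person("Mary Y", "Department B", "Folsom", "555-0102"),
        Person("Mary Z", "Department C", "Sacramento", "555-0103"),
    ]

    def lookup(first_name: str, site: Optional[str] = None) -> str:
        """Return a phone number, or a clarifying question if the match is ambiguous."""
        matches = [p for p in DIRECTORY if p.name.split()[0] == first_name]
        if site:  # use prior context, e.g. the office the caller is visiting
            narrowed = [p for p in matches if p.site == site]
            matches = narrowed or matches
        if len(matches) == 1:
            return f"{matches[0].name}: {matches[0].phone}"
        options = " and ".join(f"{p.name} who works in {p.department}" for p in matches)
        return f"I see {len(matches)} matches: {options}. Are you talking about either of them?"

    print(lookup("Mary", site="Folsom"))
    ```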

    It’s already here and it’s as invisible as possible, and that’s the end goal.

    • monobot@lemmy.ml · 2 points · 1 year ago

      This is just what’s visible to users/customers, and it’s only the tip of the iceberg.

      The real use of AI is in every industry, and the best use case is for jobs that were impossible before.

      • webghost0101@sopuli.xyz · 1 point · 1 year ago

        That’s subjective. While being able to do stuff we couldn’t before is amazing, I think the “best” use case is exactly the jobs that people do right now.

        Cheap, democratized labor accessible to everyone with a phone is the dream: it could finally deliver on the early 20th-century promise that technology would bring more leisure to all.

    • XPost3000@lemmy.ml · 0 points · 1 year ago

      This article isn’t saying that AI is a fad or that it isn’t taking off (it absolutely is), but it’s also absolutely taking too much money to run.

      And if these AI companies aren’t capable of turning a profit on this technology, and consumers aren’t able to run it themselves, then it may very well fall off the public stage and back into computer science research papers, despite how versatile the tech may be.

      What good is a genie if you can’t get the lamp?

      • abhibeckert@beehaw.org · 1 point · 1 year ago

        “it’s also absolutely taking too much money to run”

        Well, maybe they should raise their prices then?

        If they raise the prices too far, though, I’ll just switch to running Facebook’s open-source Llama model on my workstation. I’ve tested it and it works with acceptable quality and performance; the only thing missing is tight integration with the other tools I use. That could (and I expect soon will) be fixed.
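
        For reference, a minimal local-Llama sketch using the llama-cpp-python bindings; the model path is a placeholder for whatever quantized weights are on disk, and the tool integration mentioned above is left out.

        ```python
        # Minimal local Llama inference via llama-cpp-python
        # (pip install llama-cpp-python). The model path is a placeholder.
        from llama_cpp import Llama

        llm = Llama(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf")

        out = llm(
            "Q: Why are hosted LLM services expensive to run?\nA:",
            max_tokens=128,
            stop=["Q:"],
        )
        print(out["choices"][0]["text"])
        ```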

        • XPost3000@lemmy.ml · 1 point · 1 year ago

          Exactly, nobody’s gonna wanna pay $20-$80 per month if they can just run an open source version for free

          Classic proprietary L, ironically enough for "Open"AI