• givesomefucks@lemmy.world · 13 days ago

    Direct source:

    https://blog.exolabs.net/day-4/

    But before getting excited, I’d encourage you to look at the screenshot of the “sleepy Joe” story they had it write…

    It technically prints out words. And most of them are in plausible order, but it’s also largely gibberish.

    So it “works” — it’s just not useful.

    • Jakule17@lemmy.world · 13 days ago

      It technically prints out words. And most of them are in plausible order, but it’s also largely gibberish.

      So just like a normal LLM

      • givesomefucks@lemmy.world · 13 days ago

        So just like a normal LLM

        No, much worse…

        It’s worth looking at the screenshot. I feel like their choice to show a picture of the screen, rather than text that could easily be copied and pasted, was intentional.

  • Dem Bosain@midwest.social · 13 days ago

    I have ChatterUI on my phone. Running a model locally is butter-smooth, except for the first response after changing models. I haven’t done anything more than brief chats with it, so I don’t know if it’s useful for more than that.