• atrielienz@lemmy.world · 1 day ago

    Think about the people you willingly surround yourself with. Then think about how often they agree with the things you think and say.

    As the saying goes, “I’m sure there’s someone out there who believes the exact opposite of everything I believe, and while I’m sure they aren’t a complete idiot…”

    Everyone is susceptible to the feedback loop. Everyone can fall victim to the seduction of an echo chamber. While not everyone would ignore the red flag that this thing is a machine/digital algorithm rather than a person or a sentient/sapient being, it’s not really that hard to see how we got here. Echo chambers exist all over the internet. The difference is that most of them have some voices of dissent. An LLM doesn’t offer that. The companies keep trying to add it in, but it’s basically antithetical to the design.

    When you add the fact that making it addictive benefits their bottom line, it’s pretty obvious that they’re trying to walk the line between being regulated by the government and making their product as popular as possible.

    I don’t think they really knew it would have this exact effect. But I do think they plan to take advantage of it now that they know, and I don’t think we humans will all be able to resist the temptation of an automated propaganda machine.

    This is especially true because mental health care, and healthcare in general, in this country have been failing for decades. Even people who “don’t have mental health problems” aren’t magically mentally healthy; they just don’t know the status of their mental health. A whole lot of people, in the US especially, are mentally ill or facing neurological medical problems that they don’t know about.

    • Kuma@lemmy.world · 2 hours ago

      Sounds to me like it’s mostly luck whether you fall into that hole or not. Or a lot of people would rather believe in something even though they know it isn’t true, or that the chance is extremely low, like trying to win the lottery.

      I’ve never met ppl irl who see LLMs as more than a digital tool that can be wrong (at least not to my knowledge), so that’s why it’s hard for me to understand (because I haven’t been able to ask). I understand it can be nice to be heard, but to me an LLM is very hollow: there is no experience behind its answers, and you can tell it doesn’t care or try to understand (which is also why I don’t understand the attachment). I actually get more frustrated than happy when it says empty stuff like “you’ve got good instincts!”, when it doesn’t challenge me at all on my decisions/statements (even when I ask it to), or when I ask it for inspiration (its creativity is extremely lacking). I feel the same about ppl if I think they aren’t trying to understand and just give me empty replies, like a salesperson reading from a script.

      So that’s mostly why it’s hard for me to understand, even though I know mental health and loneliness are a big part of it. I still don’t understand why people can feel attached to LLMs and go so far for/with them. Echo chambers with actual ppl are a lot more understandable; that makes sense to me. LLMs do not.