Seven years later, Kyle’s argument is that AirSpace has turned into what he now calls Filterworld, his term for how algorithmic recommendations have become one of the most dominant forces in culture and, as a result, have pushed society to converge on a kind of soulless sameness in its tastes.

  • souperk@reddthat.com · edited · 7 months ago

    IMO it’s never about the tool, but about who controls it. For example, nuclear energy is neutral on its own: used to generate power, it’s (arguably) a net positive; used for bombing, it’s a net negative.

    The same goes for algorithms: used to save lives at hospitals, they’re a net positive; used to harvest people’s attention, they’re a net negative.

    (For anyone interested, I have MAB (multi-armed bandit) algorithms in mind. They can be used to prioritize patients at hospitals, or to make recommendations on social media; see the sketch below. You can guess which application is more commonly used, better researched, and better funded.)
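
    For the curious, here is a minimal sketch of one common MAB strategy, epsilon-greedy. The commenter doesn’t name a specific variant, so this is purely illustrative: each “arm” could be a post to recommend or a triage option, and the reward a click or a patient outcome.

    ```python
    import random

    class EpsilonGreedyBandit:
        """Epsilon-greedy multi-armed bandit: balances exploring arms
        (e.g. posts to recommend, or triage strategies) with exploiting
        the arm that has paid off best so far."""

        def __init__(self, n_arms, epsilon=0.1):
            self.epsilon = epsilon
            self.counts = [0] * n_arms    # times each arm was pulled
            self.values = [0.0] * n_arms  # running mean reward per arm

        def select_arm(self):
            # With probability epsilon, explore a random arm;
            # otherwise exploit the current best estimate.
            if random.random() < self.epsilon:
                return random.randrange(len(self.counts))
            return max(range(len(self.counts)), key=lambda a: self.values[a])

        def update(self, arm, reward):
            # Incremental mean update for the chosen arm.
            self.counts[arm] += 1
            n = self.counts[arm]
            self.values[arm] += (reward - self.values[arm]) / n

    # Toy usage (hypothetical payoff rates): arm 1 secretly pays off
    # most often, and the bandit gradually learns to favor it.
    bandit = EpsilonGreedyBandit(n_arms=3)
    true_rates = [0.2, 0.7, 0.4]
    for _ in range(1000):
        arm = bandit.select_arm()
        reward = 1.0 if random.random() < true_rates[arm] else 0.0
        bandit.update(arm, reward)
    print(bandit.values)  # estimates should approach true_rates
    ```

    The single epsilon knob is the entire explore/exploit trade-off here; production recommenders layer far more machinery on top, but the core incentive structure is the same whichever domain the rewards come from.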

    • Robert Rothenberg@floss.social · 7 months ago

      @souperk @pelespirit

      > For example, nuclear energy is a neutral thing on its own, when used to generate power it’s (arguably) a net positive…

      It’s more complicated than that.

      Mining uranium has side effects, usually borne by poorer communities.

      The fuel has to be handled safely, as does the waste, which has to be stored safely for thousands of years.

      Nuclear plants have to be designed and built well.

      The most benign democracies have made a mess of those issues.

      1/n

      • Robert Rothenberg@floss.social · 7 months ago

        @souperk @pelespirit

        > The same goes for algorithms, when they are used to save lives at hospitals it’s a net positive

        Again, more complicated.

        Are the algorithms mathematically sound, or just AI/machine learning magic fairy dust?

        Do the algorithms have implicit biases against poor people, or those with darker skin or who live in certain postcodes?

        2/n