YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person’s ideological leaning might affect which videos YouTube’s algorithms recommend to them. For right-leaning users, recommendations are more likely to come from channels that share politically extreme content, conspiracy theories, and other problematic material.

  • Amphobet@lemmy.dbzer0.com · 7 months ago

    As tempted as I am to reply “Well, duh,” I suppose it’s good that we’re getting research to back up what we already knew.

  • jtk@lemmy.sdf.org · 7 months ago (edited)

    Isn’t that basically the whole point of the algorithm? Isn’t it behaving exactly as advertised?

    You don’t even have to feed the algorithm to get those videos. I have my history turned off, so I don’t get any suggestions on my home page anymore, but when I’m watching a video, the suggestions on the side invariably include a handful of right-wing idiots. You can sometimes see how YT might think they’re related to what I’m watching (usually retro tech stuff), but they never actually are. I rarely see the same misfires with left-wing videos.

    My guess is that the brain-dead right-leaning viewers put that content in the high-engagement buckets, so it just gets suggested more often. I don’t think left-leaning people engage much with left-wing media, because it’s usually boring politics that don’t infringe on basic human rights. We already know how bad the right is just by seeing them suck with our own eyes; we don’t need to be told about it over and over to believe it, or to collect bullshit “gotcha” material for water-cooler conversations. We also already see how the politicians on the left suck in their own special ways without anyone needing to explain it to us. So what’s the point of investing in a video when a few words, or none at all, will do the trick? Algorithms target idiots with unfounded rage. It’s just that simple. I wouldn’t even call it an algorithm, just basic number crunching.

  • vexikron@lemmy.zip · 7 months ago

    They optimize recommendations to a large degree to induce anger and rage, because anger and rage are the most effective ways to drive platform engagement.

    Facebook does the same.
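
A minimal sketch of the kind of engagement-driven ranking being described here. The weights, field names, and scoring rule are invented for illustration; this is not YouTube’s or Facebook’s actual recommender, just a toy ranker that maximizes engagement regardless of why people engage.

```python
# Toy engagement-weighted ranking. All weights and signal names are invented
# for illustration; a ranker like this is indifferent to *why* people engage,
# so content that provokes outrage can score highest.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    watch_minutes: float  # average minutes watched per impression
    comments: int         # comments per 1,000 impressions
    shares: int           # shares per 1,000 impressions


def engagement_score(v: Video) -> float:
    # Hypothetical weights: longer watch time, more comments, and more shares
    # all push a video up the ranking, whatever its content.
    return 1.0 * v.watch_minutes + 0.5 * v.comments + 2.0 * v.shares


videos = [
    Video("Calm policy explainer", watch_minutes=3.0, comments=4, shares=1),
    Video("Rage-bait culture-war clip", watch_minutes=7.5, comments=60, shares=25),
]

for v in sorted(videos, key=engagement_score, reverse=True):
    print(f"{engagement_score(v):6.1f}  {v.title}")
```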

  • flying_gel@lemmy.world · 7 months ago

    I’m not sure it’s just right-leaning users. I’m pretty far to the left, and I keep getting anti-trans, anti-COVID, right-wing talking points quite frequently. I keep pressing thumbs down, but they keep coming.

    • CALIGVLA@lemmy.dbzer0.com · 7 months ago

      Thumbs down actually makes YouTube recommend more of that stuff to you, because you’ve engaged with the content. As other people have said, the best way to avoid recommendations you don’t want is to just ignore the content and not watch it. If it’s already being recommended to you, click the three dots on the video thumbnail and choose “Not interested” or “Don’t recommend channel”.
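
A minimal sketch of that distinction, assuming a recommender that treats any interaction as engagement. The signal names and filtering logic are invented for illustration, not YouTube’s actual system: a thumbs-down still registers as an interaction with the channel, while “Don’t recommend channel” drops it from the candidate pool entirely.

```python
# Toy illustration: "dislike" still counts as engagement, while an explicit
# "don't recommend channel" removes the channel from candidates. Signal names
# and logic are invented for the example.
ENGAGEMENT_EVENTS = {"like", "dislike", "comment", "watch"}  # dislike counts too


def update_profile(profile: dict, channel: str, event: str) -> None:
    if event in ENGAGEMENT_EVENTS:
        profile.setdefault("engaged_channels", set()).add(channel)
    elif event == "dont_recommend_channel":
        profile.setdefault("blocked_channels", set()).add(channel)


def candidates(profile: dict, channels: list[str]) -> list[str]:
    blocked = profile.get("blocked_channels", set())
    engaged = profile.get("engaged_channels", set())
    # Blocked channels are dropped; engaged channels are sorted to the front.
    return sorted((c for c in channels if c not in blocked),
                  key=lambda c: c not in engaged)


profile: dict = {}
update_profile(profile, "RageBaitChannel", "dislike")
print(candidates(profile, ["RageBaitChannel", "RetroTech"]))  # RageBaitChannel first

update_profile(profile, "RageBaitChannel", "dont_recommend_channel")
print(candidates(profile, ["RageBaitChannel", "RetroTech"]))  # RageBaitChannel gone
```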