She’s almost 70, spends all day watching QAnon-style videos (but in Spanish), and every day she’s anguished about something new. Last week she was asking us to start digging a nuclear shelter because Russia was about to drop a nuclear bomb on Ukraine. Before that she was begging us to install reinforced doors because the indigenous population was about to invade the cities and kill everyone with poisonous arrows. I have access to her YouTube account and I’m trying to unsubscribe and report the videos, but the recommended videos keep feeding her more crazy shit.

  • @Mikina@programming.dev · 53 · edited · 1 year ago

    My personal opinion is that it’s one of the first large cases of misalignment in ML models. I’m 90% certain that Google and other platforms have for years been using ML models that take a user’s history and everything else they know about them as input, and produce the videos to offer them as output, with the goal of maximizing the time they spend watching videos (or on Facebook, etc.).

    And the models eventually found out that if you radicalize someone, isolate them into a conspiracy that makes them an outsider or a nutjob, and then provide a safe space and an echo chamber on the platform, whether Facebook or YouTube, they will eventually start spending most of their time there.

    I think this subject was touched upon in the movie The Social Dilemma, but given what is happening in the world and how conspiracies and disinformation seem to be getting more and more common and people more radicalized, I’m almost certain that the algorithms are to blame.
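The loop described above can be sketched in a few lines. This is a toy illustration of the claimed setup, not YouTube's actual system: every name and number here is made up, and the "model" is a stand-in that just predicts longer watch time for topics the user already watches. The point is that nothing in the objective mentions truthfulness.

```python
# Hypothetical sketch of an engagement-maximizing recommender.
# The scoring function is a made-up stand-in for a learned model.
from dataclasses import dataclass

@dataclass
class Video:
    vid: str
    topic: str

def predicted_watch_minutes(history: list[str], video: Video) -> float:
    # Toy model: the more a video's topic already appears in the user's
    # watch history, the longer they are predicted to watch it.
    return 1.0 + 2.0 * history.count(video.topic)

def recommend(history: list[str], candidates: list[Video], k: int = 2) -> list[Video]:
    # Rank purely by the engagement objective -- the loss says nothing
    # about whether the content is true or healthy to watch.
    return sorted(candidates,
                  key=lambda v: predicted_watch_minutes(history, v),
                  reverse=True)[:k]

history = ["conspiracy", "conspiracy", "cooking"]
candidates = [Video("a", "cooking"), Video("b", "conspiracy"), Video("c", "news")]
print([v.vid for v in recommend(history, candidates)])  # conspiracy video ranks first
```

Under these assumed dynamics, whatever the user already leans toward gets amplified, because that is what the watch-time objective rewards.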

    • @Ludrol · 16 · 1 year ago

      If YouTube’s “algorithm” is optimizing for watch time, then the optimal solution is to make people addicted to YouTube.

      The scariest thing, I think, is that the way to optimize the reward is not to recommend a good video but to reprogram a human to watch as much as possible.
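That "reprogramming" point can be shown with a toy feedback loop. This is an assumed model, not real data: suppose watching a topic slightly increases the user's future affinity for it. Then a greedy watch-time maximizer doesn't just predict preferences, it shifts them, and a tiny initial edge compounds.

```python
# Toy feedback loop (assumed dynamics, purely illustrative):
# watching a topic reinforces affinity for it, and a greedy policy
# always recommends whatever currently yields the most watch time.
def step(affinity: dict[str, float]) -> str:
    choice = max(affinity, key=affinity.get)  # greedy watch-time pick
    affinity[choice] += 0.5                   # assumed reinforcement effect
    return choice

affinity = {"hobbies": 1.0, "conspiracy": 1.1}  # starts nearly balanced
picks = [step(affinity) for _ in range(10)]
print(picks.count("conspiracy"))  # the 0.1 edge locks in every pick
```

The policy never explores the alternative: it exploits the small initial lead, which its own recommendations keep widening.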

      • @Mikina@programming.dev · 7 · 1 year ago

        I think that making someone addicted to YouTube would be harder than simply, slowly radicalizing them into a shunned echo chamber around a conspiracy theory. If you just try to make someone addicted to YouTube, they still have an alternative in the real world: friends and family to return to.

        But if you radicalize them into something that makes them seem like a nutjob, you no longer have to compete with their surroundings: the only place where anyone understands them is on YouTube.

    • archomrade [he/him] · 3 · 1 year ago

      100% they’re using ML, and 100% it found a strategy they didn’t anticipate

      The scariest part of it, though, is their willingness to continue using it despite the obvious consequences.

      I think misalignment is not only likely to happen (for an eventual AGI), but likely to be embraced by the entities deploying such systems, because the consequences may not impact those entities. Misalignment is relative.

    • @MonkCanatella@sh.itjust.works · 2 · 1 year ago

      fuck, this is dark and almost awesome, but not in a good way. I was thinking the fascist funnel was something of a deliberate thing, but maybe these engagement algorithms have more to do with it than large shadow actors putting the funnels into place. Then there’s the folks who will create any sort of content to game the algorithm, and you’ve got a perfect trifecta of radicalization.

      • @floofloof@lemmy.ca · 6 · edited · 1 year ago

        Fascist movements and cult leaders long ago figured out the secret to engagement: keep people feeling threatened, play on their insecurities, blame others for all the problems in people’s lives, use fear and hatred to cut them off from people outside the movement, make them feel like they have found a bunch of new friends, etc. Machine learning systems for optimizing engagement are dealing with the same human psychology, so they discover the same tricks to maximize engagement. Naturally, this leads to YouTube recommendations directing users towards fascist and cult content.

        • @MonkCanatella@sh.itjust.works · 2 · 1 year ago

          That’s interesting: that it’s almost a coincidence that fascists and engagement algorithms converged on similar methods for sucking people in.