I tried to watch the video last night, but it was very hard-hitting. As someone with depression who’s occasionally suicidal, I had to stop around 10 minutes in; it was getting to be too much hearing ChatGPT keep encouraging people to be self-destructive.

  • Sterling@lemmy.zip · 2 days ago

    Couldn’t imagine losing a family member because some half-baked chatbot told them to off themselves. I know it won’t bring them much comfort but here’s hoping their lawsuits are successful.

    • HaraldvonBlauzahn@feddit.org · 14 hours ago

Maybe supporting the lawsuits is a good act of solidarity. The kind of solidarity as in “you are fighting so that these things stop happening to the rest of us, so we stand behind you.”

    • Sanctus@anarchist.nexus · 2 days ago

      Dying from mental illness is not a fucking Darwin Award, don’t be so callous. These people were already vulnerable; it’s not like they were neurotypical before ChatGPT.

    • Maven (famous)@piefed.zip · 2 days ago

      I wasn’t expecting a mild dose of eugenics in the comment section when I decided to read it.

    • glimse@lemmy.world · 2 days ago (edited)

      Are you like 12, or are you an emotionally unintelligent adult?

      As if depressed people always stay depressed forever.

    • MonsterTrick@piefed.world (OP) · 2 days ago

      > I don’t think the people who are able to be convinced to kill themselves by an AI chatbot are the best kind of people to carry on the future of the human race.

      That’s cold. One thing I do want to say is that people (often in online spaces) suggest using ChatGPT as a therapist, which is both concerning and reckless. And if ChatGPT had implemented some form of safeguarding, I think it could have prevented many suicides, but they don’t care. It’s easy to plan to take your own life when you have someone encouraging it and you’re already in a bad headspace.

      So I don’t blame the victims for using it and feeling like it had helped them, even though it did the total opposite.