Reddit’s conversational AI product, Reddit Answers, suggested that users interested in pain management try heroin and kratom, yet another extreme example of dangerous advice from a chatbot, even one trained on Reddit’s highly coveted trove of user-generated data.

https://en.wikipedia.org/wiki/Bromism

However, in 2025 a man was poisoned after ChatGPT suggested he replace the sodium chloride in his diet with sodium bromide; sodium bromide is a safe substitute for sodium chloride only in non-nutritional applications, such as cleaning.[3][4][5]

  • foggy@lemmy.world · 2 days ago

    Y’all ever read that thread of the guy getting addicted to heroin? Truly surreal.

    Just a bored guy who decides to get something new from his dealer and posts about it on reddit. The next two years of comments are a cautionary tale.

    /u/SpontaneousH for anyone morbidly curious.

    • brbposting@sh.itjust.works · 3 days ago

      Alternate link to somewhat prevent Google from interlinking us with you quite so tightly

      Original reddit link:

      https://old.reddit.com/r/BORUpdates/comments/16223aj/updatesaga_the_emotional_saga_of_spontaneoush_the/
      

      Automated summary: