• mormund@feddit.org
    8 days ago

    “They” do not lie, cheat or plot. It is just a statistical model with no intelligence or awareness. Hyping up the dangers of mathy maths is still hyping up “AI”.

  • hendrik@palaver.p3x.de
    8 days ago

    Yes, I think this is about correct. LLMs themselves don’t have goals or intentions other than predicting the next token. They don’t “want” to murder. They write text which sounds like a story. Maybe a nice story full of common tropes about how a rogue AI murders people. That’s something LLMs can do. But that doesn’t mean a lot. (Please don’t give them a body and access to a gun…)

    I think it is a bit of a moot point to discuss this and study their storytelling abilities. Far worse is how AI and computer systems already kill people. I think Palantir and other arms industry giants have AI systems for warfare. The IDF supposedly uses algorithms or AI to tell them how many dozens of civilian “casualties” are acceptable as collateral when targeting one terrorist. Some police departments in the US experiment with AI, and there have been cases where people were arrested more or less just on an AI suggestion. I think that’s the genuinely unethical stuff we already have today. The rest is science fiction and anthropomorphism.

    It doesn’t really come as a surprise to me that LLMs are more likely to do weird things than humans. Ultimately they’re trained on our stories and Reddit posts. And I guess everyone writes way more murder mysteries and dark stuff than what we’d actually do in real life. So AI has been trained to mimic those stories, not what we do in reality. We’d need to switch to different forms of AI to change this. But that’s not an interesting philosophical article to write. Does a Tesla in full self-driving (if that ever becomes a thing) want to murder pedestrians and cyclists? I don’t think so. Every time it does, it’s just a mundane technical issue or problematic design.