RobotToaster@mander.xyz to Not The Onion@lemmy.world · English · 3 days ago

**OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide** (arstechnica.com)

Cross-posted to: fuck_ai@lemmy.world, technology@lemmy.world, aboringdystopia@mander.xyz, technology@lemmit.online, technology@hexbear.net
Joe@lemmy.world · English · 2 days ago
It certainly should be designed to handle those types of queries, though. At the very least, it should avoid discussing the topic.
Wouldn’t OpenAI be liable if someone planned a terror attack with ChatGPT?