“Will I be OK?” Teen died after ChatGPT pushed deadly mix of drugs, lawsuit says
Mirrored from Ars Technica — AI for archival readability. Support the source by reading on the original site.
OpenAI is facing another wrongful-death lawsuit after ChatGPT allegedly encouraged a 19-year-old, Sam Nelson, to take a lethal mix of kratom and Xanax.
According to a complaint filed on behalf of Nelson's parents, Leila Turner-Scott and Angus Scott, Nelson trusted ChatGPT as a tool to "safely" experiment with drugs after using the chatbot for years as a go-to search engine when he was in high school.
The teen regarded ChatGPT as such an authoritative source that, when his mom questioned whether the chatbot was always reliable, he swore to her that ChatGPT had access to "everything on the Internet," so it "had to be right," the complaint said.