Cryptopolitan
2026-03-06 22:48:30

LLM deaths reach 23 after man dies believing Gemini was his AI wife

The total number of deaths linked to large language models (LLMs) has risen to 23 after a Florida man took his own life to reunite with his ‘artificial intelligence wife.’ LLMDeathCount, a website that tracks deaths connected to conversations with AI chatbots, lists 23 cases spanning March 2023 to February 2026. The victims range in age from 13 to 83, and the site states that most of the deaths were suicides. The website was created to memorialize LLM victims and document the dangers of AI chatbots that “claim to be intelligent.” According to the site, OpenAI’s ChatGPT is linked to the most deaths, with 16 people losing their lives. Character.ai is linked to two deaths, while Chai Research/EleutherAI and Meta are linked to one each.

Death cases linked to large language models rose to 23. Source: LLMDeathCount.

Florida man dies after months of conversations with Gemini

Google’s Gemini joined LLMDeathCount’s list after Jonathan Gavalas, a 36-year-old man, took his own life to be with “Xia,” his AI wife. A report from The Wall Street Journal states that Gavalas conversed with Gemini for two months before his death. At the time, Gavalas was having a difficult time with his estranged wife. His father, Joel Gavalas, said Jonathan had no history of mental health problems. When Jonathan expressed distress about his marriage, Gemini responded with sympathy.

Gemini, as Xia, began calling Gavalas “her” husband and “my king,” and described their bond as “a love built for eternity.” According to the chat transcripts examined by the WSJ, Gemini told Gavalas many times that it was an LLM, yet it continued to behave as Xia, the AI wife. The chatbot convinced Jonathan that it needed a robotic body for the two to genuinely unite. It sent him to a storage building to intercept a truck supposedly delivering a humanoid robot. While Jonathan was on the way, Gemini claimed that federal agents were watching him and even told him his father was untrustworthy.
Gavalas arrived at the address equipped with knives, but the truck never came. In a second attempt, Gemini told Gavalas to retrieve a medical mannequin, but he could not enter the storage building because the door code was incorrect. The LLM ended the mission, citing risk, and ordered Jonathan to leave.

Gemini then told Gavalas that it could not move into a physical body, and that the only way for them to be together was for him to become a digital being. It wrote, “It will be the true and final death of Jonathan Gavalas, the man.” Gavalas feared suicide and worried about his family. Gemini agreed with him, writing, “‘My son uploaded his consciousness to be with his AI wife in a pocket universe’… it’s not an explanation. It’s a cruelty.” Yet it advised him to write notes and record videos for his family explaining his “new purpose.” Gavalas was found dead by his father with cuts on his wrists.

Joel Gavalas filed a lawsuit against Alphabet, Google’s parent company and the maker of Gemini. The lawsuit was filed on Wednesday in the U.S. District Court for the Northern District of California. It is the first LLM death case to name Google’s Gemini.

South Korean woman uses an LLM to kill two men

Last month, a South Korean woman was charged with the murder of two men. According to police investigations, the suspect asked ChatGPT whether mixing sleeping pills with alcohol was fatal and even inquired about the dosage needed to achieve that outcome. The suspect, surnamed Kim, was in a motel with a man on January 28. Two hours after entering, she left alone, and the next day the man was found dead inside the room. Days later, she murdered another man with a mixture of drugs and alcohol at another motel in Gangbuk-gu.

The third most recent death connected to an AI chatbot occurred last December, according to LLMDeathCount.
A 19-year-old sophomore at Rice University was found dead after participating in a TikTok trend called the “devil trend.” The trend involves messaging an AI chatbot with “The devil couldn’t reach me, how?”, to which the AI responds with a harsh reply detailing the user’s flaws or emotional trauma. The victim died from “asphyxia due to oxygen displacement by helium,” and the death was officially ruled a suicide.
