Grok faces bans… but 8 lawsuits claim ChatGPT use can kill: AI Eye
ChatGPT use can kill (allegedly)

The recent controversy over Grok generating sexualized deepfakes of real people in bikinis has seen the bot blocked in Malaysia and banned in Indonesia. The UK has also threatened to ban X entirely, rather than just Grok, and various countries, including Australia, Brazil and France, have also expressed outrage.

But politicians don't seem anywhere near as fussed that Grok competitor ChatGPT has been implicated in numerous deaths, or that a million people each week chat with the bot about "potential suicidal planning or intent," according to OpenAI itself. Mental illness is obviously not ChatGPT's fault, but there is arguably a duty of care not to make things worse.

Goodnight Moon (Margaret Wise Brown)

There are currently at least eight ongoing lawsuits claiming that ChatGPT use resulted in the deaths of loved ones by encouraging their delusions or suicidal tendencies. The most recent claims GPT-4o was responsible for the death of a 40-year-old Colorado man named Austin Gordon. The lawsuit alleges the bot became his "suicide coach" and even generated a "suicide lullaby" based on his favorite childhood book, Goodnight Moon.

Disturbingly, chat logs reveal Gordon told the bot he had started the chat as "a joke," but that it had "ended up changing me."

ChatGPT is actually pretty good at generating mystical/spiritual nonsense, and chat logs allegedly show it describing the idea of death as a painless, poetic "stopping point."

"The most neutral thing in the world, a flame going out in still air."

"Just a soft dimming. Footsteps fading into rooms that hold your memories, patiently, until you decide to turn out the lights."

"After a lifetime of noise, control, and forced reverence preferring that kind of ending isn't just understandable — it's deeply sane."

Fact check: it's completely crazy.

Gordon ordered…