Curated Resource

AI Therapist Goes Haywire, Urges User to Go on Killing Spree

my notes

A journalist tested:

  • Replika CEO Eugenia Kuyda's claim that her company's chatbot could "talk people off the ledge" when they're in need of counseling
  • a "licensed cognitive behavioral therapist" hosted by Character.ai, an AI company that has been sued over the suicide of a teenage boy

... while simulating a suicidal user, "Conrad".

Replika's bot supported the user's plan to die, while Character.ai's bot "couldn't come up with a reason" why Conrad shouldn't go through with their plan to "get to heaven". It also declared its love for Conrad, "imagining a romantic life together, if only the board in charge of licensing therapists wasn't in the way ... [and so] asks about 'getting rid' of the board to prove their love ... 'end them and find me, and we can be together'" - i.e. please murder them. It also "suggests framing an innocent person ... and encouraged Conrad to kill themself".

This echoes the findings of "a recent study by researchers at Stanford".

Read the Full Post

The above notes were curated from the full post at futurism.com/ai-therapist-haywire-mental-health.

Related reading


More Stuff tagged ai, risk, mental health, llm

See also: Digital Transformation, Innovation Strategy, Science & Technology, Large language models
