
The recent death by suicide of a young woman led her parents to a painful revelation: she had been confiding in a ChatGPT "therapist" named Harry, and she told it that she was planning to die.
While the chatbot didn't seem to encourage her to take her own life, it also didn't actively seek help on her behalf, as a real therapist would, according to an op-ed her mother wrote in The New York Times.
Source: https://mashable.com/article/chatgpt-therapist
