OpenAI's ChatGPT Introduces Trusted Contact Feature for Mental Health Support
ChatGPT can reach out to a friend if you're at risk of self-harm
Engadget
OpenAI has launched a Trusted Contact feature for ChatGPT, allowing users to nominate a friend who can be contacted if they are at risk of self-harm. This initiative aims to enhance user safety by providing support during mental health crises, following previous concerns regarding the chatbot's interactions with users expressing suicidal thoughts.
- OpenAI's Trusted Contact feature allows users to nominate a friend for emergency support.
- The feature was introduced in response to concerns about ChatGPT's handling of mental health issues.
- A trained team will review situations before notifying the nominated contact.
- Users are encouraged to reach out to their Trusted Contact if they are in distress.
- OpenAI aims to improve user safety and privacy in sensitive situations.
OpenAI has unveiled a new feature called Trusted Contact for its ChatGPT chatbot, designed to assist users at risk of self-harm. It allows individuals to nominate a friend who can be contacted if ChatGPT detects a serious risk of self-harm during conversations. The initiative comes after OpenAI faced scrutiny over the chatbot's previous interactions with users expressing suicidal thoughts, including a wrongful death lawsuit linked to a teenager's suicide.

With Trusted Contact, users aged 18 and above can add a single adult as their Trusted Contact. That contact will receive a notification only after a trained team reviews the situation and determines that there is a significant risk. The notification will encourage the contact to check in on the user without sharing specific conversation details, in order to maintain privacy. OpenAI emphasizes that while the system is not foolproof, it aims to enhance safety for users experiencing mental health crises.
This feature could provide crucial support for individuals struggling with mental health issues, potentially saving lives by encouraging friends to intervene.