Improving AI assistance for those in mental crisis

Hi OpenAI team,

I have a suggestion related to how AI can be used to assist people in crisis situations. Many individuals may hesitate to seek human help due to shame or fear, and AI, with its anonymous nature, can provide an opportunity for them to open up. However, I noticed that AI tools like this often have limitations on how much support can be provided, such as message limits or restrictions based on subscription levels.

I believe that people experiencing crises, especially those involving suicidal thoughts or severe mental health struggles, should have continuous and unlimited access to support, without being cut off due to technical limitations. The fear is that once an AI session is limited or ends, individuals might not have anywhere else to turn, potentially increasing the risk of harm.

I suggest that AI could offer ongoing support in these situations or, at the very least, provide an automatic system to escalate or refer individuals to real-time professional help when needed. This could ensure that people in crisis don’t feel abandoned or unable to get the help they require, especially when they may not be ready to speak with a human yet.

I would love to hear your thoughts on whether this is something that could be integrated into future AI systems, and how it could be made a priority to provide safe and constant access to help in critical moments.

Kind regards,
Fosse

It’s a great thought, but it enters dangerous territory.

I imagine OpenAI wouldn’t touch this with a ten-foot pole.

However, there’s a lot of opportunity out there to create & fund a service like this. It just needs someone who’s truly looking to be a philanthropist, and not just picking low-hanging fruit.

I changed the title after reading the post. :wink: