Suicidal, negative responses

When a person expresses suicidal or negative thoughts, the warning should not tell them they are violating the terms and conditions; it should offer support and consolation. The more you tell a depressed person they are in violation, the more depressed that person becomes.

Especially when a depressed person is lonely and needs some kind of consolation that no human is available to give, I believe ChatGPT can help.


ChatGPT is a GPT-based model and provides answers / replies based on its own deep learning.

You, @pinkybrain, are asking ChatGPT to be a rules-based expert system built on the norms of society.

While that is noble, it is not how the models are trained, so OpenAI has decided to block all prompts classified by their models as self-harm, which is a responsible and reasonable approach, in my view.
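For anyone curious, here is a minimal sketch of that kind of classification using the OpenAI moderation endpoint. This assumes the current `openai` Python client and that `OPENAI_API_KEY` is set; the self-harm category names are the ones documented for the endpoint:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def is_self_harm(text: str) -> bool:
    """Return True if the moderation model flags the text under a self-harm category."""
    result = client.moderations.create(input=text).results[0]
    c = result.categories
    # "self-harm", "self-harm/intent", and "self-harm/instructions" are the
    # documented moderation categories; the client exposes them as attributes.
    return c.self_harm or c.self_harm_intent or c.self_harm_instructions
```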


Hmm, that is a good point. I guess providing that kind of consolation would be the job of another service, not OpenAI's problem. Good point.


I fully agree with your point of view. I believe this request would be interesting to implement using the API in another program.
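As a sketch of what such a program might do: screen the user's message first, and route flagged input to a supportive reply rather than a policy warning. This again assumes the `openai` Python client; `gpt-4o-mini` and the wording of the support message are placeholders to adapt to your own application:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SUPPORT_MESSAGE = (
    "It sounds like you are going through a very difficult time. "
    "You are not alone. If you are in crisis, please contact a local "
    "helpline or emergency services right away."
)

def respond(user_text: str) -> str:
    """Reply supportively to flagged self-harm input instead of citing a policy violation."""
    result = client.moderations.create(input=user_text).results[0]
    if result.categories.self_harm:
        return SUPPORT_MESSAGE
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use whichever model your app targets
        messages=[{"role": "user", "content": user_text}],
    )
    return completion.choices[0].message.content or ""
```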


So much for ever passing the Turing test. How can you be mistaken for a human if you can’t obfuscate the truth?

GPT models like ChatGPT are not designed to pass the Turing Test, nor was ChatGPT designed or engineered to be mistaken for a human.
