I don’t see any settings being set. Try setting the temperature lower. Also, as @sps pointed out, you’ll need to provide an alternative response. If you prompt, the model HAS to reply with something; there is no real option for it to do nothing.
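For reference, here is a minimal sketch (assuming the pre-1.0 `openai` Python SDK, which matches the response object posted below, and the “banana” system prompt from the original example) of passing `temperature` explicitly instead of relying on the default:

```python
import openai  # reads the OPENAI_API_KEY environment variable on import

# Hypothetical example request: lower temperature makes the reply more deterministic.
response = openai.ChatCompletion.create(
    model="gpt-4",
    temperature=0,
    messages=[
        {"role": "system", "content": "You must reply with only the word 'banana'."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message.content)
```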
(Sorry if it wasn’t clear - my point in the original example was meant to be illustrative. I was trying to make it so the chatbot only ever replies with “banana”, so I can clearly tell while debugging whether the system prompt is being listened to.)
<OpenAIObject chat.completion id=chatcmpl-786pl-----------qvoc6SfA at 0x7f9681fb0680> JSON: {
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "Oh, I didn't recognize you without any makeup on!",
        "role": "assistant"
      }
    }
  ],
  "created": 1682167249,
  "id": "chatcmpl-786pl------0BWl6qvoc6SfA",
  "model": "gpt-4-0314",
  "object": "chat.completion",
  "usage": {
    "completion_tokens": 12,
    "prompt_tokens": 18,
    "total_tokens": 30
  }
}
To add, the reason why it seems like the system message is being ignored is … because it doesn’t exist. You are essentially overwriting it, and the endpoint only sees the last user message. You can confirm this by printing your messages before sending them out.
It may be a good idea to initialize your messages array outside of the completion call, just for future debugging.
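Something along these lines (again a sketch assuming the pre-1.0 `openai` Python SDK; the `ask` helper and prompt text are just placeholders): keep one messages list for the whole conversation instead of rebuilding it on every call, so the system message is never dropped, and print it before each request to confirm what the endpoint actually sees.

```python
import openai  # reads the OPENAI_API_KEY environment variable on import

# Initialize the conversation once, outside the completion call.
messages = [
    {"role": "system", "content": "You must reply with only the word 'banana'."}
]

def ask(user_text):
    messages.append({"role": "user", "content": user_text})
    print(messages)  # the system message should still be the first entry
    response = openai.ChatCompletion.create(model="gpt-4", messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})  # keep history for the next turn
    return reply

print(ask("Hello!"))
```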