Can I have some help with custom prompts?

Hi all,
I would be grateful if someone could help me with custom prompts for the API.

I have a frontend form that sends questions to the OpenAI API and receives the responses.

The backend is PHP with cURL.
Everything works fine except the custom prompts: the OpenAI API never answers based on my custom prompts, and I can't figure out why.

In the backend PHP I'm using the API endpoint


$url = 'https://api.openai.com/v1/chat/completions';

This PHP file calls the custom-prompts PHP file, which looks like this:

<?php


function getAIResponse($userInput) {
    if (strtolower($userInput) === 'what is your name?') {
        // User asks for the AI's name
        $data = array(
            'messages' => array(
                array(
                    'role' => 'user',
                    'content' => 'What is your name?'
                ),
                array(
                    'role' => 'assistant',
                    'content' => 'My name is ChatBot.'
                )
            ),
            'model' => 'gpt-3.5-turbo',
            'max_tokens' => 1000,
            'temperature' => 0.2,
            'stop' => '\n'
        );
        return $data;
    } else {
        // Handle other user input scenarios 
        //  Pass the user input to the AI for processing
        $data = array(
            'messages' => array(
                array(
                    'role' => 'user',
                    'content' => $userInput
                )
            ),
            'model' => 'gpt-3.5-turbo',
            'max_tokens' => 1000,
            'temperature' => 0.2,
            'stop' => '\n'
        );
        return $data;
    }
}
?>

I don't know why OpenAI doesn't answer correctly when I ask “what is your name?”

Maybe this endpoint doesn't accept custom prompts?
Or is the “function getAIResponse(” approach not supported with custom prompts?
Or is my code wrong?

Any help would be appreciated.
Thanks in advance.

Cheers

'messages' => array(
                array(
                    'role' => 'user',
                    'content' => 'What is your name?'
                ),
                array(
                    'role' => 'assistant',
                    'content' => 'My name is ChatBot.'
                )
            )

You show yourself asking the question and then the AI already giving the answer. There's nowhere for the conversation to go from there…

Instead it would be like:

'messages' => array(
                array(
                    'role' => 'system',
                    'content' => 'You are ChatBot, a helpful AI assistant'
                ),
                array(
                    'role' => 'user',
                    'content' => 'What is your name?'
                )
            )

The AI will say its name, or answer other questions, without any odd input matching.

If you wanted some user-input matching, you wouldn't need to ask an AI at all; you could just return a hard-coded “my name is” string as the value.
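For completeness, a minimal sketch of how that system + user message pair could be sent from PHP with cURL (my sketch, not the thread's exact backend; it assumes the API key is available in an `OPENAI_API_KEY` environment variable):

```php
<?php
// Request body using a system message instead of a pre-answered
// assistant turn; model and parameters mirror the thread above.
$data = array(
    'model' => 'gpt-3.5-turbo',
    'messages' => array(
        array('role' => 'system',
              'content' => 'You are ChatBot, a helpful AI assistant'),
        array('role' => 'user',
              'content' => 'What is your name?')
    ),
    'max_tokens' => 1000,
    'temperature' => 0.2
    // note: no 'stop' sequence here
);

$apiKey = getenv('OPENAI_API_KEY');
if ($apiKey !== false && $apiKey !== '') {
    $ch = curl_init('https://api.openai.com/v1/chat/completions');
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_HTTPHEADER     => array(
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiKey
        ),
        CURLOPT_POSTFIELDS     => json_encode($data)
    ));
    $response = curl_exec($ch);
    curl_close($ch);

    $result = json_decode($response, true);
    echo $result['choices'][0]['message']['content'] ?? '(no response)';
}
```

The important part is the shape of `$data`: the system message sets the persona once, and each user turn is appended after it.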

Hi,
thanks for your help and the time you spent. I have tried your code version, but unfortunately
it produces the same effect as my code does.

I always get "I am an AI language model developed by OpenAI, so I don’t have a personal name. You can refer to me as OpenAI or GPT-3. How can I assist you today? "

and not the custom name. It's weird that I'm stuck at this point and can't get further with the code.

Please note that when I change the code to

$url = 'https://api.openai.com/v1/engines/text-davinci-002/completions';

and use a prompt like this

$data = array(
    'prompt' => "You: What is your name?\nAI: My name is AI.\n$userInput\nAI:",
    'max_tokens' => 1000,
    'temperature' => 0.5,
    'stop' => '\n'
);

it works, but I get charged one cent even when I just send the word “hello”. I already contacted support about this, but after a week I still have no answer.

With the previous code the costs are fine (1 cent per 1k tokens, as expected), but my custom prompts get no answers…
Weird!

The above sounds like code an AI wrote. It uses a deprecated endpoint URL and addressing method, and a model that is being shut off in three weeks.

text-davinci-002 is a model from the original GPT-3 series; it costs $0.02 per 1k tokens, compared to gpt-3.5-turbo, which is 1/10th that price, and using it creates a new model entry for completions on the usage page. The usage page also no longer shows fractions lower than $0.01.

It may be that “ChatBot” is read by the AI as “a ChatBot” and not a name. Let’s look.

[screenshot: Playground runs showing the AI answering with the name]

I tried three different API Playground runs, and the system message followed by the user message, as I gave in my example, always resulted in recognition of the name. You could still give it a more human, non-functional name, though.

You should also remove the stop sequence from your chat API call, which I just noticed. The chat model may emit a carriage return as its initial token, which, while sanitized for you by the API, would result in no response; otherwise it would limit output to one paragraph.
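There is also a PHP quoting pitfall worth noting here (my observation, not from the thread): inside single quotes, `'\n'` is a literal backslash followed by the letter n, not a newline, so the stop sequence in the code above was never an actual newline in the first place:

```php
<?php
// Escape sequences are only interpreted inside double quotes in PHP.
$single = '\n';  // two characters: a backslash and the letter n
$double = "\n";  // one character: an actual newline (LF)

var_dump(strlen($single));      // int(2)
var_dump(strlen($double));      // int(1)
var_dump($single === $double);  // bool(false)
```

So even where a stop sequence is legitimately wanted, it would need to be written as `"\n"` in double quotes.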

Well… I give up; it seems like these are all buggy.

And yes, text-davinci-002 says $0.02 per 1k, but I'm charged $0.01 per STRING! Even if I just send “hello”.

Even in the Playground I can't reproduce what you shared with me.

:joy: :joy: :joy: :joy: :joy: :joy:

Going to get a job.

You really need to follow the “you are…” format that has been pre-trained. The AIs have been over-trained on just getting ChatGPT’s system prompt and acting upon it. They’ve also been over-trained on NOT acting like anything other than an AI, not taking on characters, and not responding beyond how ChatGPT performs in upvoted user interactions.

Disobeying system programming, and even a list of instructions on how to process data, is now the operational mode.

You can get a bit better performance with GPT-4, as the weights of fine-tuning haven't been as overfitted.


This works and is the solution, thank you :+1: