API..."provide more information on the topic"

New here - searched but couldn’t find…

So I'm using prompting and the API in Python, at the terminal prompt. When I prompt "anymore?" or something similar, it seems to break the wall and pulls in responses to other people's prompts. Is this how it's supposed to function? Can I restrict it, like the website's interface, so its bounds are only my session?

Hi and welcome to the Developer Forum!

Can you give a snippet of the API-calling Python you are using, along with the prompts used and the replies generated, please?

1 Like

I suspect you are misusing a completion-model AI such as davinci, perhaps by following old web links or the outdated knowledge you get from simply asking ChatGPT to write the code for you.

A completion model will keep on writing from wherever you left off. You likely want the "chat completions" endpoint and the gpt-3.5-turbo model instead; you can discover the programming and usage in the API documentation linked in the forum's sidebar.
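As a sketch of what "formatted messages" means, here is the request body a chat-completions call takes, built up without actually making a network call (the model name and system text below are just illustrative placeholders):

```python
import json

def build_chat_request(history, user_input):
    """Assemble the messages list that a chat-completions endpoint expects.

    Each message is a dict with a "role" ("system", "user", or "assistant")
    and a "content" string. This only builds the payload; it does not call the API.
    """
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    messages += history  # earlier turns, if any
    messages.append({"role": "user", "content": user_input})
    return {"model": "gpt-3.5-turbo", "messages": messages}

request = build_chat_request([], "anymore?")
print(json.dumps(request, indent=2))
```

With this format the model always knows which text is an instruction, which is the user, and which is its own prior reply, so a bare word like "anymore?" no longer gets treated as the start of a document to continue.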

davinci making silly text from just putting in a random word:

(screenshots of davinci's output)

1 Like

Just the basic example…

import openai

# Set your OpenAI API key
api_key = '***********************************'

# Initialize the OpenAI API client
openai.api_key = api_key

# Function to chat with the model
def chat_with_model(prompt):
    response = openai.Completion.create(
        engine="text-davinci-002",  # You can choose a different engine if needed
        prompt=prompt,
        max_tokens=150,   # Adjust the max_tokens based on your needs
        temperature=0.7,  # Adjust the temperature for creativity (higher values are more random)
    )
    return response.choices[0].text

# Start a conversation
print("Chat with the model. Type 'exit' to end the conversation.")
while True:
    user_input = input("ChatGPT: ")

    if user_input.lower() == 'exit':
        break

    # Provide the user input as a prompt to the model
    response = chat_with_model(user_input)

    # Display the model's response
    print("Model:", response)

Just an example… I asked nothing related beforehand, and every time I ask, something completely unrelated pops up.

…except it's providing code, and other seemingly relevant answers to very specific questions. It's sort of like in the 80s when phone lines crossed and you could listen in on other people's conversations.

Again: it's because you are using text-davinci-002 without the correct style of prompt for that model.

This is how a chatbot reacts to your “word” when programmed correctly:

Put in all this language as your prompt and you’ll see that you get a better response:

Here is a conversation between a human and an advanced AI chatbot that is friendly and helpful.

AI: Hello, I am Bob, a friendly AI. How can I help?
Human: What’s the capital of Washington state?
AI: It is the city of Olympia. I can answer more questions if you like.
Human: What is a baby duck called?
AI: A baby duck is called a duckling.
Human: anymore?
AI:

This style of prompt is called few-shot. It shows the older AI models how to respond before the final question, and also gives the AI a clear place to put its answer.
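To make the mechanics concrete, here is a small sketch that assembles the few-shot prompt above programmatically and appends the user's new question with an open "AI:" line for the model to complete (the helper name is mine; a real completion call would also pass a stop sequence such as "Human:" so the model doesn't write both sides of the conversation):

```python
# The example conversation mirrors the few-shot prompt in this post.
FEW_SHOT = (
    "Here is a conversation between a human and an advanced AI chatbot "
    "that is friendly and helpful.\n\n"
    "AI: Hello, I am Bob, a friendly AI. How can I help?\n"
    "Human: What's the capital of Washington state?\n"
    "AI: It is the city of Olympia. I can answer more questions if you like.\n"
    "Human: What is a baby duck called?\n"
    "AI: A baby duck is called a duckling.\n"
)

def make_prompt(user_input):
    # Append the new question and leave "AI:" open so the model's
    # continuation is the chatbot's answer, not random text.
    return FEW_SHOT + f"Human: {user_input}\nAI:"

print(make_prompt("anymore?"))
```

Now "anymore?" sits at the end of a visible conversation, so continuing the text naturally produces a follow-up answer instead of unrelated output.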

As a beginner, though, you won't want to spend time learning the older completion models. Learn the chat model endpoints and the formatted messages they accept.

You must also supply the chat history yourself each turn, so the AI can understand the recent conversation.
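A minimal sketch of that history-keeping, assuming the chat-style message format (the function names and system text are illustrative, and no API call is made here):

```python
# Rolling chat history: every turn, resend the recent messages so the
# model has context for follow-ups like "anymore?".
history = []

def next_messages(user_input, max_turns=10):
    """Record the user's turn and return the full messages list to send."""
    history.append({"role": "user", "content": user_input})
    # Keep only the last few turns so the prompt doesn't grow without bound.
    recent = history[-max_turns * 2:]
    return [{"role": "system", "content": "You are a helpful assistant."}] + recent

def record_reply(text):
    """Store the assistant's reply so the next turn includes it."""
    history.append({"role": "assistant", "content": text})

msgs = next_messages("What is a baby duck called?")
record_reply("A baby duck is called a duckling.")
msgs = next_messages("anymore?")
# msgs now contains the system message plus the earlier question and answer,
# so the follow-up "anymore?" arrives with its context attached.
```

Without this resending step, every request starts from a blank slate, which is exactly why a lone "anymore?" produces unrelated text.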

1 Like

ahhhh that makes sense. Thank you for being patient with my n00bness.

2 Likes