@ruby_coder why does it succeed but mine doesn’t? Do you have any idea?
Well, I’m a great coder (hahaha), and I wrote a test lab that exercises the entire OpenAI API so I can help folks here who have problems and need “real” help in a public OpenAI developer community (like you).
First of all, you started off with OpenAI API code written by ChatGPT; that was a mistake.
Second, you did not follow @dhiaeddine.khalfalla’s correct advice to print your entire response so others can see what the whole response looks like (part of debugging).
Run your code again, but do not use the print line you got from ChatGPT. Use this instead and post back what you get:
print(response)
I ran into this issue and never figured out why, but it seems to have something to do with the question asked after the initial prompt.
In my case I work around it by checking for an empty response and having the code ask the user to rephrase, so it doesn’t fail in place. Here is an example of how I handle this in JavaScript:
// Trim first so a whitespace-only completion also counts as empty
var s = oJson.choices[0].text.trim();
// Empty response handling: ask the user to rephrase instead of failing
if (s === "") {
  txtOutput.value += "Eva: I'm sorry, can you please ask me in another way?";
} else {
  txtOutput.value += "Eva: " + s;
}
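The same guard ports directly to Python. Here is a minimal sketch of the identical idea, operating on a plain dict shaped like the API response in this thread (the function name `reply_for` and the dict literal are illustrative, not from the original code):

```python
def reply_for(response):
    """Return a chat reply, falling back to a rephrase request
    when the model sends back an empty completion."""
    text = response["choices"][0]["text"].strip()
    if text == "":
        return "Eva: I'm sorry, can you please ask me in another way?"
    return "Eva: " + text

# Empty completion, like the one reported in this thread:
print(reply_for({"choices": [{"text": ""}]}))
# Normal completion (leading/trailing whitespace is stripped):
print(reply_for({"choices": [{"text": "  Hello!  "}]}))
```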
@ruby_coder here’s the terminal output:
{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "logprobs": null,
      "text": ""
    }
  ],
  "created": created,
  "id": id,
  "model": "text-davinci-003",
  "object": "text_completion",
  "usage": {
    "prompt_tokens": 3384,
    "total_tokens": 3384
  }
}
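One thing worth noticing in that output: the usage block implies the model generated zero completion tokens, which is exactly why `text` is empty. A quick sanity check in Python, using the numbers from the response above:

```python
# Usage figures copied from the response posted in this thread
usage = {"prompt_tokens": 3384, "total_tokens": 3384}

# completion_tokens is not shown in the printed response, but it is
# simply the difference between total and prompt tokens
completion_tokens = usage["total_tokens"] - usage["prompt_tokens"]
print(completion_tokens)  # 0: the model emitted nothing before stopping
```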
Please post the code you used.
Thanks
Here is my code:
response = openai.Completion.create(
    model="text-davinci-003",
    prompt=prompt,
    max_tokens=256,
    n=1,
    stop="####",
    temperature=0.0,
)
Good.
So why is the API saying you sent 3384 tokens?
When I used the prompt text you shared, it came to only about 100 tokens.
101 tokens, to be exact.
My category list is very long, about 700 categories. That’s why I typed “…” when I replied to you earlier.
OK, I am retiring from this topic, @lee19619. I asked you for the exact prompt you used, and you sent a very small test version without telling us it was not the actual text (until just now, when I spotted the huge difference in token count).
Good luck @lee19619
Someone else can take it from here and help you out.