Why does the ChatGPT API return empty results when using certain types of prompts in Python?

@ruby_coder why does it succeed but mine doesn’t? Do you have any idea?

Well, I’m a great coder (haha), and I wrote a test lab that exercises the entire OpenAI API, so I can help folks here who have problems and need “real” help in a public OpenAI developer community (like you) :slight_smile:

First of all, you started off with OpenAI API code written by ChatGPT; that was a mistake.

Second, you did not follow @dhiaeddine.khalfalla’s correct advice to print your entire response so others can see what it looks like (part of basic debugging).

Run your code again, and do not use the print line you got from ChatGPT. Use this instead and post back what you get:

print(response)

:slight_smile:

1 Like

I ran into this issue and never figured out why. But it seems to have something to do with the question asked after the initial prompt.

In my case I get around this by checking for an empty response and then having the code ask the user to rephrase, so the code doesn’t fail in place. Here is an example of how I am handling this in JavaScript:

    var s = oJson.choices[0].text;
    // Empty Response Handling
    if (s == "") {
        txtOutput.value += "Eva: I'm sorry can you please ask me in another way?";
    } else {
        txtOutput.value += "Eva: " + s.trim();
    }
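Since the original question is about Python, here is a rough Python equivalent of the guard above. It is only a sketch: `response` stands in for the object returned by `openai.Completion.create`, mocked here as a plain dict so the snippet is self-contained, and the fallback message is just illustrative.

```python
# Rough Python equivalent of the JavaScript empty-response guard above.
# `response` mimics the dict returned by openai.Completion.create;
# a hard-coded dict is used here so the sketch runs on its own.
response = {"choices": [{"text": ""}]}

text = response["choices"][0]["text"].strip()
if not text:
    # Empty completion: ask the user to rephrase instead of failing silently.
    output = "Eva: I'm sorry can you please ask me in another way?"
else:
    output = "Eva: " + text

print(output)
```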
1 Like

@ruby_coder here’s the terminal:

{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "logprobs": null,
      "text": ""
    }
  ],
  "created": created,
  "id": id,
  "model": "text-davinci-003",
  "object": "text_completion",
  "usage": {
    "prompt_tokens": 3384,
    "total_tokens": 3384
  }
}

Hi @lee19619

What was the stop param you sent to the API?

:slight_smile:

@ruby_coder I passed “####” to it as you said.

Please post the code you used.

Thanks

:slight_smile:

Here is my code:

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=prompt,
    max_tokens=256,
    n=1,
    stop="####",
    temperature=0.0,
)
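One quick way to see why a completion came back empty is to compare the token counts in `usage` and look at `finish_reason`. A minimal sketch, using a dict shaped like the terminal output pasted earlier in this thread rather than a live API call:

```python
# Sketch: diagnose an empty completion from the usage numbers.
# `response` mimics the structure of the terminal output in this thread.
response = {
    "choices": [{"finish_reason": "stop", "index": 0, "text": ""}],
    "usage": {"prompt_tokens": 3384, "total_tokens": 3384},
}

usage = response["usage"]
completion_tokens = usage["total_tokens"] - usage["prompt_tokens"]
choice = response["choices"][0]

print("prompt tokens:    ", usage["prompt_tokens"])
print("completion tokens:", completion_tokens)
print("finish_reason:    ", choice["finish_reason"])

# completion_tokens == 0 together with finish_reason == "stop" means the
# model hit the stop sequence before emitting any visible text at all.
```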

Good.

So why is the API saying you sent 3384 tokens to the API?

When I used the prompt text you shared, the prompt was like 100 or so tokens only.

101 tokens to be exact.

:slight_smile:

My category list is very long, about 700 categories. That’s why I typed “…” when I replied to you earlier.
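A very long category list can inflate the prompt far beyond what a quick glance suggests. For exact counts you would use the model’s real tokenizer (the `tiktoken` package), but as a rough standard-library sanity check you can use OpenAI’s rule of thumb of roughly 4 characters per token for English text. The category names below are hypothetical placeholders:

```python
# Rough prompt-size sanity check using only the standard library.
# This is NOT the real tokenizer (the tiktoken package gives exact counts);
# the rule of thumb is roughly 4 characters per token for English text.
def approx_token_count(text: str) -> int:
    return max(1, len(text) // 4)

# Hypothetical example: a ~700-item category list inflates the prompt quickly.
categories = [f"category_{i}" for i in range(700)]
prompt = "Classify the text into one of: " + ", ".join(categories)

print("approximate prompt tokens:", approx_token_count(prompt))
```

Checking this locally before calling the API makes a surprise like 3384 prompt tokens much easier to catch.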

OK, I am retiring from this topic, @lee19619, because I asked you for the exact prompt you used and you sent a tiny test version without telling us it was not the actual text (until just now, when I spotted the huge difference in token counts).

Good luck @lee19619

Someone else can take it from here and help you out.

:slight_smile: