API Text Value adding returns

Hi,

When receiving a result from the API, is it possible to stop it from adding two new lines to the start of the text response?

As in the example below, the text results always begin with \n\n, which I would like to prevent if possible:

"choices": [
        {
            "text": "\n\nChristmas is celebrated annually on December 25.",
            "index": 0,
            "logprobs": null,
            "finish_reason": "stop"
        }
]

Which model and settings are you using?

It might be just as easy to strip them out with a regular expression after you get the result back.
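For example, stripping leading newlines after the fact is a one-liner. A minimal sketch (the helper name `clean_completion` is made up for illustration):

```python
import re

def clean_completion(text: str) -> str:
    # Remove any leading whitespace/newlines the model prepends
    # to the "text" field of a completion choice.
    return re.sub(r"^\s+", "", text)

print(clean_completion("\n\nChristmas is celebrated annually on December 25."))
```

For this simple case, `text.lstrip()` would do the same job without a regex.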

PS - welcome to the community! Hope you stick around!

Hi Paul, thanks for the response. This is what my settings look like:

{
  "model": "text-davinci-003",
  "prompt": "<prompt>",
  "temperature": 0,
  "max_tokens": 2048,
  "top_p": 1.0,
  "frequency_penalty": 0.0,
  "presence_penalty": 0.0
}

Weird. Does it add them for every single request, or just most of them?

If you share the prompt, I might be able to glean some more information on why it’s doing it. Do you have any extra spaces or anything at the end of the prompt?

The prompt would have been:

{
  "model": "text-davinci-003",
  "prompt": "christmas when",
  "temperature": 0,
  "max_tokens": 2048,
  "top_p": 1.0,
  "frequency_penalty": 0.0,
  "presence_penalty": 0.0
}

It’s happening with every result I get.

I don’t see anything right away that could be causing it. Could just be how text-davinci-003 sends it back…

Is it not possible to strip it out after you get it back?

Maybe try adding a space after your prompt… maybe the prompt is too short for it to continue and it does two line breaks? I dunno…

Curious, tho!


Thanks a lot for your help! Adding a double return after the prompt stopped them from appearing in the text result.

{
  "model": "text-davinci-003",
  "prompt": "<prompt> \n \n",
  "temperature": 0,
  "max_tokens": 2048,
  "top_p": 1.0,
  "frequency_penalty": 0.0,
  "presence_penalty": 0.0
}
"choices": [
        {
            "text": "Christmas is an annual festival commemorating the birth of Jesus Christ, observed primarily on December 25 as a religious and cultural celebration among billions of people around the world.",
            "index": 0,
            "logprobs": null,
            "finish_reason": "stop"
        }
]
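In code, the workaround amounts to appending the trailing blank lines to the prompt before building the request body. A minimal sketch in Python (the helper name `build_payload` is made up; the field names follow the legacy /v1/completions request shape shown above):

```python
import json

def build_payload(prompt: str) -> dict:
    # Append " \n \n" so the model treats the prompt as finished
    # and does not prepend its own blank lines to the completion.
    return {
        "model": "text-davinci-003",
        "prompt": prompt + " \n \n",
        "temperature": 0,
        "max_tokens": 2048,
        "top_p": 1.0,
        "frequency_penalty": 0.0,
        "presence_penalty": 0.0,
    }

print(json.dumps(build_payload("christmas when"), indent=2))
```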

There’s more than one way to skin a cat! :wink:

Glad it worked out.


Came here to say that I was experiencing exactly the same issue with text-davinci-003. Adding \n \n at the end of my prompt resolved it.

Did you try prompting the API to not add new lines to the beginning of the response?

Paul, thanks for this guidance. I was experiencing a different but similar issue: I was truncating the text of my question to keep it below 3,000 characters, and text-davinci-003 always wanted to complete the sentence before giving me the answer. Adding the \n \n got it to realize that the question was over, and now it just gives me the answer. Excellent!
