ChatGPT API - No line breaks from response

Hey, I am calling the API with cURL (via GuzzleHttp) and I want to preserve the line breaks in the API's answers, for example an answer with a bullet-point list. How can I achieve this?
My code (using GuzzleHttp):

$client->request('POST', 'https://api.openai.com/v1/chat/completions', [
  'body' => '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Give me 3 points why sport is good."}
    ],
    "temperature": 0.5,
    "max_tokens": 500,
    "top_p": 1.0,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0
  }',
  'headers' => [
    'Content-Type' => 'application/json',
    'Authorization' => "Bearer XY"
  ],
]);

OK, solved it myself. The line breaks are actually in the response, but they got lost when the text was printed to the browser, since HTML collapses plain \n characters. Made them visible with this:

echo '<pre>' . $result . '</pre>';        // either keep the \n and render inside <pre>
echo str_replace("\n", '<br>', $result);  // or convert each \n into an HTML line break

Could you still undelete your post? It looked good and could be helpful for others. Cheers 🙂

I can’t seem to undelete it. I posted it a few seconds after you found the answer yourself.

Here is the original post text:


The response will probably have line breaks (it depends on the content you asked for); a good example is if you ask for a bullet list. If you are displaying it in a web page, you may need to replace the \n with a <br/>.


I have implemented response streaming, but I am not getting line breaks in the streamed chunks. Can anyone help?
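
For context, here is a minimal sketch assuming the legacy openai Python SDK (pre-1.0, the same ChatCompletion API used in the reply below); stream_answer is just a placeholder name. The streamed deltas do contain the \n characters, so you can accumulate them and only convert to <br/> when rendering as HTML:

import openai

# Minimal sketch, assuming the legacy openai SDK (pre-1.0) used elsewhere in
# this thread. stream_answer is a hypothetical helper, not part of any API.
def stream_answer(messages, model_name="gpt-3.5-turbo"):
    response = openai.ChatCompletion.create(
        model=model_name,
        messages=messages,
        stream=True,  # yields partial chunks as they are generated
    )
    full_text = ""
    for chunk in response:
        delta = chunk.choices[0].delta.get("content", "")
        full_text += delta  # the "\n" characters arrive inside these deltas
    # Convert line breaks only at display time, e.g. for a web page:
    return full_text.replace("\n", "<br/>")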


In the ask_openai function, I replaced \n with <br/>:

import openai
# OpenAIConfig is a Django model defined elsewhere in this project; it stores
# the default generation settings.

def ask_openai(messages):
    openai_config = OpenAIConfig.objects.first()
    temperature = openai_config.temperature if openai_config else 0.7
    model_name = openai_config.model_name if openai_config else 'gpt-4'
    max_tokens = openai_config.max_tokens if openai_config else None
    top_p = openai_config.top_p if openai_config else 1.0
    frequency_penalty = openai_config.frequency_penalty if openai_config else 0.0
    presence_penalty = openai_config.presence_penalty if openai_config else 0.0

    response = openai.ChatCompletion.create(
        model=model_name,
        temperature=temperature,
        max_tokens=max_tokens,
        top_p=top_p,
        frequency_penalty=frequency_penalty,
        presence_penalty=presence_penalty,
        messages=messages,
    )
    answer = response.choices[0].message.get('content', '').strip()
    html_answer = answer.replace('\n', '<br/>')  # convert line breaks for HTML output
    return html_answer

and it works fine. Thanks!