ChatCompletions Params Besides Temperature

Is there any evidence that the ChatCompletions API responds to parameters other than temperature, like top_p, frequency_penalty, and presence_penalty?

My efforts to alter these through my ChatCompletions requests don’t seem to have any effect, and nothing in the response object gives conclusive evidence either way.
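
For context, here’s roughly what I’m sending. This is a minimal sketch using the v1 openai Python SDK, with a placeholder model name and prompt rather than my exact request:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Same prompt, two very different penalty settings; the completions
# come back looking much the same either way.
for penalty in (0.0, 2.0):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=[{"role": "user", "content": "List ten animals."}],
        temperature=0.0,
        frequency_penalty=penalty,
        presence_penalty=penalty,
    )
    print(penalty, "->", response.choices[0].message.content)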

Bonus points if you cite your sources, give working code, or provide conclusive evidence.

Okay, looking at the network tab while using the OpenAI Playground, I can see the POST payload:

Request URL: https://api.openai.com/v1/chat/completions
Request Method: POST

{
  "model": "ft:gpt-3.5-turbo-0613:artist::8FaIcau5",
  "messages": [
    {"role": "user", "content": "Hello"}
  ],
  "temperature": 0.6,
  "frequency_penalty": 0.36,
  "max_tokens": 256,
  "presence_penalty": 0,
  "stream": true,
  "top_p": 1
}

So it looks like the Playground passes all of these parameters through to the ChatCompletions API. That’s good to know.
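
For completeness, the same request can be reproduced outside the Playground. Here’s a minimal sketch with the v1 openai Python SDK, copying the parameters from the payload above (the fine-tuned model name comes from that payload; substitute your own):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Mirror the Playground's payload, minus streaming, so the full
# response object comes back in one piece.
response = client.chat.completions.create(
    model="ft:gpt-3.5-turbo-0613:artist::8FaIcau5",
    messages=[{"role": "user", "content": "Hello"}],
    temperature=0.6,
    frequency_penalty=0.36,
    presence_penalty=0,
    top_p=1,
    max_tokens=256,
)

print(response.choices[0].message.content)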