I was having this exact same problem and I’ve just fixed it. I know this sounds strange, but I fixed it by setting the temperature parameter in the request to 0. I’m not sure how or why this fixed it, but once I changed it, I no longer get a 400 Bad Request. Check your parameters and tweak them to see if the 400 goes away.
I’m also experiencing this issue. I have a very simple pass-through API endpoint in Next.js using the OpenAI Node package. I’ve double-checked my keys, billing, etc. However, adding max_tokens=1024 in any position causes an error, and adding temperature and other parameters didn’t fix it. I would include images showing the example, but I can only upload one image as a new user.
However, I’m able to get the same request to work just fine with cURL, which leads me to think this may be an issue in the Node package. I confirmed this by testing this raw fetch request, which bypasses the package entirely:
import type { NextApiRequest, NextApiResponse } from "next";

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<OpenAPICompletion>
) {
  const requestOptions = {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer OMITTED",
    },
    body: JSON.stringify({
      model: "text-davinci-003",
      prompt: "Please provide 1024 tokens of test data",
      max_tokens: 1024,
    }),
  };
  await fetch("https://api.openai.com/v1/completions", requestOptions)
    .then(async (response) => {
      res.send(await response.json());
    })
    .catch((error) => {
      console.error("Error on request:", error);
      res.send(error);
    });
  // console.log("req.query", req.query);
  // try {
  //   const response = await openai.createCompletion(req.query);
  //   res.send(response.data as CreateCompletionResponse);
  // } catch (e) {
  //   console.log("Error on request", e);
  //   res.send(e);
  // }
}
Which returns the expected result:
Yes, as I mentioned at the top: values like temperature that you read from environment variables (or from req.query) arrive as strings and need to be converted to numbers before they go into the request body. In my case, that fixed it.
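To illustrate the string-vs-number point, here is a minimal sketch. The helper name buildCompletionBody and the environment variable names are illustrative, not from any library; the idea is just that process.env (and req.query) values are always strings, so sending them unconverted produces JSON like "max_tokens": "1024", which the API rejects with a 400.

```typescript
// Values read from environment variables or query strings are always strings.
// Coerce them to numbers before building the request body, or the API sees
// "max_tokens": "1024" instead of 1024 and returns 400 Bad Request.
// buildCompletionBody and the env var names are hypothetical examples.
function buildCompletionBody(env: Record<string, string | undefined>) {
  return {
    model: env.OPENAI_MODEL ?? "text-davinci-003",
    prompt: "Please provide 1024 tokens of test data",
    // Number() converts the string values; fall back to defaults when unset.
    max_tokens: Number(env.OPENAI_MAX_TOKENS ?? "1024"),
    temperature: Number(env.OPENAI_TEMPERATURE ?? "0"),
  };
}

const body = buildCompletionBody({
  OPENAI_TEMPERATURE: "0.7",
  OPENAI_MAX_TOKENS: "256",
});
console.log(typeof body.max_tokens, body.max_tokens); // logs: number 256
```

JSON.stringify then serializes these fields as JSON numbers rather than quoted strings, which is what the API expects.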
