Regarding the openai.error.InvalidRequestError:

Check this link from OpenAI on how to calculate tokens.

Please note that requests can use up to 4097 tokens shared between prompt and completion. If your prompt is 4000 tokens, your completion can be 97 tokens at most.
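To see how much room is left for the completion, you can count the prompt tokens yourself before sending the request. Here is a minimal sketch assuming the gpt-3-encoder npm package (the package choice is an assumption; any GPT tokenizer such as tiktoken works the same way):

const { encode } = require("gpt-3-encoder"); // assumed tokenizer package

const MODEL_LIMIT = 4097; // shared budget for prompt + completion

const promptTokens = encode(your_system_prompt).length;
const completionBudget = MODEL_LIMIT - promptTokens;

if (completionBudget <= 0) {
  throw new Error(`Prompt alone uses ${promptTokens} tokens and exceeds the limit`);
}

// pass the remaining budget as the max_tokens parameter on your request
// so the prompt + completion can never overflow the limit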

So in your case, the "message" array already exceeds 4097 tokens. By message I mean:

const message = [
{ role: "system", content: your_system_prompt },
{ role: "user", content: "Hi" },
]
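You can estimate the total size of that array before calling the API. A rough sketch, again assuming the gpt-3-encoder package (the few extra tokens OpenAI adds per message are an approximation, not an exact count):

const { encode } = require("gpt-3-encoder"); // assumed tokenizer package

// rough estimate: content tokens plus a few tokens of per-message overhead
const TOKENS_PER_MESSAGE = 4; // approximation based on OpenAI's cookbook guidance

const countTokens = (messages) =>
  messages.reduce(
    (total, m) => total + encode(m.content).length + TOKENS_PER_MESSAGE,
    0
  );

console.log(countTokens(message)); // must stay well under 4097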

So perhaps your system prompt is too long. If not, then perhaps you are attaching previous conversation turns and they are already too long (a trimming sketch follows the example below):

const message = [
{ role: "system", content: your_system_prompt },
// previous conversations here
{ role: "user", content: "Hi" },
]
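If the history is the problem, a common fix is to drop the oldest turns, keeping the system prompt, until everything fits. A minimal sketch, reusing the countTokens helper from above and reserving some tokens for the reply (the 500-token reserve is an arbitrary assumption):

const MODEL_LIMIT = 4097;
const COMPLETION_BUDGET = 500; // tokens reserved for the model's reply, pick your own

const trimHistory = (messages) => {
  const trimmed = [...messages]; // copy so the original array is untouched
  // index 0 is the system prompt; always keep it and the latest user message
  while (
    countTokens(trimmed) > MODEL_LIMIT - COMPLETION_BUDGET &&
    trimmed.length > 2
  ) {
    trimmed.splice(1, 1); // drop the oldest non-system turn
  }
  return trimmed;
};

const safeMessages = trimHistory(message);

This keeps the request under the limit at the cost of the model forgetting the oldest turns; summarizing the dropped turns is another option if you need that context.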