const message = { role: "user", content: question };
const response = await openai.createChatCompletion({
  model: "gpt-3.5-turbo",
  temperature: 0.1,
  messages: [...conversationHistory, message],
});
As the message says, the model has no knowledge of itself: it was trained on content from before it existed. Trust your code and the documentation, not the chat response.
Thank you, Novaphil. Your reply, together with searching the problem again, confirmed that my code is correct. This has been very helpful. I also found that I only need to set the temperature value higher so that the answers to my question are less constrained, and the response will then also identify itself as GPT-3.5-turbo.
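For illustration, here is a minimal, self-contained sketch of the request body from the snippet above with a higher temperature. The variables `conversationHistory` and `question` are assumed placeholders, and no API call is made:

```javascript
// Sketch only: builds the same request body as above, with a higher
// temperature for more varied answers. No API key or network needed.
const conversationHistory = [
  { role: "system", content: "You are a helpful assistant." }, // assumed prior history
];
const question = "Which model are you?"; // placeholder question

const message = { role: "user", content: question };
const requestBody = {
  model: "gpt-3.5-turbo",
  temperature: 0.7, // higher than 0.1 → less deterministic sampling
  messages: [...conversationHistory, message],
};

console.log(requestBody.messages.length); // 2: history entry + new user message
```

The same body can then be passed to `openai.createChatCompletion(requestBody)` as in the original snippet.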
I have another question about this, and I wonder if you would be willing to help. When calling the OpenAI library from Node.js, can I only use GPT-3? Its training-data cutoff is May 2021, while to my knowledge GPT-3.5's data runs to September 2021.
All the models are listed here. There is no literal "GPT-3" model in the API; the closest is gpt-3.5-turbo, with a September 2021 cutoff.
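To check which model IDs your key can actually see, the v3 `openai` Node SDK (the same one `createChatCompletion` comes from) has a `listModels` method. The filtering step is sketched here against a small hand-picked sample of IDs from that era rather than a live call:

```javascript
// Live call (requires a configured client and API key):
//   const res = await openai.listModels();
//   const ids = res.data.data.map((m) => m.id);

// Offline sketch using a small assumed sample of model IDs:
const ids = ["gpt-3.5-turbo", "gpt-3.5-turbo-0301", "text-davinci-003", "ada"];

// There is no plain "gpt-3" ID; filter for the gpt-3.5 family instead.
const gpt35 = ids.filter((id) => id.startsWith("gpt-3.5"));
console.log(gpt35.length); // 2
```

Running the live call instead of the sample list would show exactly which model names your account can pass to `createChatCompletion`.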