I have a use case where I'm calling the OpenAI API with the gpt-3.5-turbo-16k model. The goal is to answer specific questions using a dataset that is loaded into the prompt beforehand. However, the model doesn't always follow the prompt: it either says that it doesn't have recent information or that the dataset doesn't contain the requested information. How can I write a prompt that reliably gets gpt-3.5-turbo-16k to answer questions about certain topics from the loaded dataset?
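For context, here is a minimal sketch of the kind of prompt construction I mean — the function name and instruction wording are just illustrative, and it assumes the dataset text fits inside the 16k context window:

```python
# Hypothetical sketch: pin the model to the loaded dataset via the system
# message. `build_messages` and the instruction text are illustrative, not
# the exact code from my application.

def build_messages(dataset_text: str, question: str) -> list:
    """Construct chat messages that restrict answers to the supplied dataset."""
    system = (
        "You are a data assistant. Answer ONLY from the DATASET below. "
        "If the answer is not in the DATASET, reply exactly: "
        "'Not found in dataset.' Do not use outside knowledge.\n\n"
        f"DATASET:\n{dataset_text}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

# Example call (requires OPENAI_API_KEY); this uses the legacy 0.x Python SDK
# that was current when gpt-3.5-turbo-16k was released:
#
# import openai
# resp = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo-16k",
#     temperature=0,  # lower temperature to reduce drift away from the dataset
#     messages=build_messages(dataset_text, "What was Q3 revenue?"),
# )
# print(resp.choices[0].message.content)
```

Even with this structure, the model sometimes ignores the dataset or claims it lacks the information, which is the behavior I'm trying to eliminate.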
Thanks, community!