Then, to confirm, I entered gpt-3.5-turbo-16k into my API settings. And lo and behold!
So, in my particular use case, I found gpt-4 the best model, but far too expensive for daily use. I reluctantly fell back to using gpt-3.5-turbo which mostly worked, but kept giving me headaches because of the 4K context window.
Today, OpenAI solved all of those problems! Now, I’m sure other issues will arise as we continue on this journey, but today, I am a Happy OpenAI Camper!
These use cases are enabled by new API parameters in our /v1/chat/completions endpoint, `functions` and `function_call`, that allow developers to describe functions to the model via JSON Schema, and optionally ask it to call a specific function.
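To make that concrete, here is a minimal sketch of what a request body using those two parameters might look like. The `get_current_weather` function and its fields are purely hypothetical examples of my own; the shape of `functions` (a list of JSON Schema descriptions) and `function_call` (`"auto"`, `"none"`, or a specific function name) follows the announcement.

```python
import json

# Hypothetical function the model may choose to call. The "parameters"
# field is a JSON Schema object, as required by the `functions` parameter.
weather_function = {
    "name": "get_current_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name, e.g. Paris"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}

# Request body for POST /v1/chat/completions. `function_call` can be
# "auto" (let the model decide), "none", or {"name": ...} to force a call.
request_body = {
    "model": "gpt-3.5-turbo-16k",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "functions": [weather_function],
    "function_call": {"name": "get_current_weather"},  # force this function
}

print(json.dumps(request_body, indent=2))
```

When the model decides to call the function, the response contains a `function_call` object with the function name and a JSON string of arguments, which your code then executes and feeds back to the model in a follow-up message.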