Because of the knowledge cut-off, the models do not have up-to-date knowledge of the OpenAI API versions. This makes using these models to generate API calls more difficult than it needs to be. I'd like to create a text-only version of the API documentation so that it can be fed, alongside a prompt, as context to a model. Has anyone tried to do something similar?
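For concreteness, here is a minimal sketch of what I have in mind: read a plain-text dump of the docs and prepend it to the prompt as a system message. The file name, truncation limit, and instruction wording are all my own assumptions, not any official workflow.

```python
# Hypothetical sketch: prepend a plain-text API reference to the prompt
# so the model reads current docs instead of its stale training data.
# max_doc_chars is an arbitrary guard against blowing the context window.

def build_messages(doc_text: str, user_prompt: str, max_doc_chars: int = 100_000):
    """Assemble a chat-completions message list with the docs as context."""
    return [
        {
            "role": "system",
            "content": (
                "You write code against the OpenAI API. Use ONLY the "
                "reference below; ignore any memorized, possibly outdated "
                "API knowledge.\n\n--- API REFERENCE ---\n"
                + doc_text[:max_doc_chars]
            ),
        },
        {"role": "user", "content": user_prompt},
    ]

docs = "POST /v1/chat/completions ..."  # stand-in for the real text dump
messages = build_messages(docs, "Write a streaming chat completion call.")
```

The resulting `messages` list would then be passed to `client.chat.completions.create(...)` as usual; the open question is whether a text dump small enough to fit in context is still complete enough to be useful.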
This is a noble goal. However, transferring complete API knowledge is an extremely high hurdle, given how broad the API is. You may want a Zod schema used with structured outputs, Python helpers to stream from Assistants, realtime websockets for the audio API with complex multimodal message blocks, or knowledge of every obscure endpoint parameter – all while the model also maintains its skills in other coding domains.
I've tried this, with some success: the AI could answer questions and produce straightforward code for newer areas such as Assistants or writing structured functions. However, there were just as many failures, including working code being damaged.
In my opinion, it's not worth pursuing for one's own use: the time investment would make you an API expert anyway.