Funny that 4o isn't trained on OpenAI's own beta API docs

So, you want coding help from GPT for code related to `beta.assistants`? Sorry, it hallucinates and suggests code from the old API. LOL.

**Update:** for those interested, I ended up creating a RAG assistant trained on the latest docs. I find it very helpful myself; feel free to use it. Note: for cost purposes this has to use gpt-3.5-turbo, but it's still pretty good. I hope it helps. Thanks.
InfoseekAI API - Open AI


The OpenAI PHP SDK, which is a third-party community resource, is also massive, and the ChatGPT integration doesn't want to spend 10 bazillion tokens reading every file. It is frustrating, but we are still in the early days. I was able to write my own Assistants v2 integration in one PHP file after many iterations, but it lacks the capability to handle function definitions at present. The Assistants v2 API is significantly more complex than Completions, but if you want persistent memory it is one of the few ways to get it, other than repeating old text back to the model or using a vector store.
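For contrast, the "repeating old text back" alternative mentioned above can be sketched in a few lines: you keep the running message list yourself and resend the whole thing on every request. This is a minimal, illustrative sketch (the `history` list is what you would pass as `messages` to a stateless chat completions call; no real API call is made here):

```python
# Minimal sketch of the "repeat old text back" approach to persistent
# memory: maintain the message list client-side and resend it each turn.
history = [{"role": "system", "content": "You are a helpful assistant."}]

def add_turn(history, user_text, assistant_text):
    """Append one user/assistant exchange so the next request carries it."""
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": assistant_text})
    return history

add_turn(history, "My name is Sam.", "Nice to meet you, Sam!")
add_turn(history, "What's my name?", "Your name is Sam.")

# The full history is what you'd send as `messages` on the next request;
# that replay is the only reason the model "remembers" earlier turns.
print(len(history))  # 5 messages accumulated
```

Assistants v2 threads move this bookkeeping server-side, which is exactly why the API surface is so much bigger than Completions.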

Yeah, that's a side effect of how LLM pattern-based training works.

It's mostly trained on the most plentiful data that exists; brand-new data is hard to train on.

The best way to go about it is to give it an example.
How I do it is to give it a Python example from the docs. I end up telling it, "With this new version you only need to set up the client like so. You don't even need to mention the API key."

Make it walk through and explain each bit.
Then magically it knows how everything works.
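A sketch of the kind of prompt this describes: embed a current-SDK snippet and ask the model to walk through it. The embedded example assumes the v1.x Python SDK, where `OpenAI()` picks up `OPENAI_API_KEY` from the environment; the model name is just a placeholder.

```python
# Build a prompt that embeds a current-SDK example and asks the model to
# walk through it, per the approach described above.
sdk_example = '''from openai import OpenAI

client = OpenAI()  # no key argument needed; read from OPENAI_API_KEY

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)'''

prompt = (
    "With this new version you only need to set up the client like so. "
    "You don't even need to mention the API key.\n\n"
    + sdk_example
    + "\n\nWalk through this example and explain each part."
)
print(prompt)
```

Once the model has restated each line back to you, it tends to stay on the new API for the rest of the conversation.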


I ended up feeding the docs into an assistant and RAG’ing it. Works well. Thanks!
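The retrieval step of that setup can be sketched in pure Python: chunk the docs, score chunks against the question, and prepend the best match to the prompt. Real pipelines score with embedding vectors; crude word overlap stands in for them here, and the chunks are made-up examples.

```python
# Toy sketch of the RAG retrieval step: pick the doc chunk most relevant
# to the query and stuff it into the prompt as context.
def score(chunk, query):
    """Crude relevance score: count of shared lowercase words."""
    return len(set(chunk.lower().split()) & set(query.lower().split()))

doc_chunks = [
    "Threads persist messages between runs in the Assistants API.",
    "The chat completions endpoint is stateless.",
    "Vector stores hold files for retrieval.",
]

query = "how do threads persist messages"
best = max(doc_chunks, key=lambda c: score(c, query))
prompt = f"Answer using this doc excerpt:\n{best}\n\nQuestion: {query}"
print(best)
```

With an assistant plus a vector store, the API does this chunking, scoring, and context-stuffing for you.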