I’ve built a custom integration for the Assistant API (chatbot) and added supporting files that it should use during conversations. One of those files is urls.txt, which contains specific URLs the assistant should reference when a user asks for a link.
In the system prompt, I clearly instruct the assistant to only provide URLs from the urls.txt file. However, it keeps ignoring this and instead returns random URLs from my domain, not the ones listed.
Has anyone run into this issue? How can I make the assistant reliably use the contents of the urls.txt file only?
You will have to place your own instructions as text.
You have the additional_instructions field on a run if the instructions need to be dynamic, so they aren't persisted in the chat on every turn by adding messages.
Otherwise, if using URLs is what the assistant does and must always know, they can occupy the assistant instructions.
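As a sketch of what I mean (assuming the official Python SDK, a local urls.txt, and a helper name build_url_instructions that I made up for illustration), you can load the file yourself and inject it per-run:

```python
# Build an instruction block from urls.txt so the model always sees the
# allowed URLs directly, instead of hoping a file_search tool call finds them.
def build_url_instructions(path: str = "urls.txt") -> str:
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    bullet_list = "\n".join(f"- {u}" for u in urls)
    return (
        "When the user asks for a link, answer ONLY with URLs from this list:\n"
        + bullet_list
    )

# Passing it via additional_instructions keeps the assistant's stored
# instructions unchanged (API call shown for context, not executed here):
# run = client.beta.threads.runs.create(
#     thread_id=thread.id,
#     assistant_id=assistant.id,
#     additional_instructions=build_url_instructions(),
# )
```

If the URL list is small and stable, you could instead bake the same text into the assistant's instructions once at creation time.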
Why?
File search (“added supporting files”) in Assistants is not suitable for instructing the AI. Nor is it even really suitable as a RAG solution for improving AI knowledge.
- File search tells the AI the files came from a user
- The AI has no idea what is in the vector store database
- The information can only be accessed and seen when the AI decides to call a tool, a tool that is offered to it with no statement of its purpose
- The return is based on a query the AI must write, and a semantic similarity search returns chunked results. Is a query based on user input such as "sprockets that can interface with a widget" even going to retrieve a list of URLs, content completely unlike the query, when chunks from other files that are actual language can outrank it?
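To make that last failure mode concrete, here's a toy illustration. File search actually uses embeddings, not this token-overlap score, and the chunks below are invented, but the effect is the same: a query written from the user's question shares almost nothing with a chunk that is just a list of URLs, so a prose chunk outranks it.

```python
# Toy similarity: fraction of shared tokens (Jaccard). Real retrieval uses
# embedding vectors, but a URL-only chunk scores low either way.
def overlap_score(query: str, chunk: str) -> float:
    q = set(query.lower().split())
    c = set(chunk.lower().split())
    return len(q & c) / len(q | c) if q | c else 0.0

query = "sprockets that can interface with a widget"
url_chunk = "https://example.com/products/sprocket-x1 https://example.com/docs/widgets"
prose_chunk = "Our sprockets interface with any standard widget mount."

print(overlap_score(query, url_chunk))    # 0.0 - no shared tokens at all
print(overlap_score(query, prose_chunk))  # clearly higher, so it wins retrieval
```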
The Responses API: no better.
(The past threads containing these diagnostic outputs are easy to find, as I have no use for Assistants other than to show others its issues.)
Thank you for your reply. Won't the prompt be too long for the assistant to understand if I added all the data stored in the files? And what is the purpose of the vector store database if the Assistants API can't read the files?
“File search (“added supporting files”) in Assistants is not suitable for instructing the AI. Nor is it even really suitable as a RAG solution for improving AI”
It's suitable for treating you, the developer, as a consumer and a user, delivering only ChatGPT as a product with carbon copies of its internal tool messages: users can upload their own files, and then can't understand why the AI can't answer everything about them.