Token use for updating instructions in an assistant

I’m trying to figure out if there is any, and if so, how much, token usage involved in updating an assistant’s instructions via the API. I am not uploading any files.

As in, when I create a thread, I want to append custom instructions to the assistant’s core instructions. This way each user has a slightly different experience with the same assistant. What I can’t figure out is whether appending to the instructions uses any tokens.

Relatedly, when I call a run, does the length of the instructions in the assistant have any effect on how many tokens are used? Or does that depend only on the context and the length of the message being sent?
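For reference, here is a minimal sketch of what I mean by appending per-user instructions. It assumes the OpenAI Python SDK’s Assistants API, where `runs.create` accepts an `additional_instructions` parameter that is appended to the assistant’s stored instructions for that run; the helper name and IDs are hypothetical:

```python
# Hypothetical helper: build per-run parameters that append user-specific
# instructions instead of modifying the assistant itself.

def build_run_params(assistant_id: str, user_note: str) -> dict:
    """Keyword arguments intended for client.beta.threads.runs.create()."""
    return {
        "assistant_id": assistant_id,
        # Appended after the assistant's core instructions for this run only.
        "additional_instructions": f"User preferences: {user_note}",
    }

params = build_run_params("asst_abc123", "Prefers concise answers.")
# Usage (not executed here, requires an API client and a thread):
# run = client.beta.threads.runs.create(thread_id=thread.id, **params)
```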



There is no retrieval method for assessing the tokens used. Costs are even obfuscated on the usage page of your account.

Assistants are loaded with tokens you didn’t write: the context is filled to its maximum with conversation history and retrieval results, and the model can make iterative calls to internal functions.

All language received and emitted by the AI model behind the Assistants framework uses billable tokens. Instructions are part of what is placed into the AI’s context.

A “run” of a thread can have a new instruction attached that replaces the assistant’s instructions. Threads do not allow any messages to be added to the conversation other than user-role messages.
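A sketch of those two constraints, assuming the OpenAI Python SDK (the IDs are hypothetical, and the API calls are shown but not executed):

```python
# Per-run override: passing `instructions` when creating a run replaces
# (does not append to) the assistant's stored instructions, for that run only.
run_params = {
    "assistant_id": "asst_abc123",  # hypothetical assistant ID
    "instructions": "Respond in formal English and cite sources.",
}

# Adding a message to a thread: per the constraint above, only the
# "user" role is accepted for messages added to the conversation.
message_params = {
    "role": "user",
    "content": "Summarize our last conversation.",
}

# Usage (not executed here, requires an API client and a thread):
# client.beta.threads.messages.create(thread_id=thread.id, **message_params)
# run = client.beta.threads.runs.create(thread_id=thread.id, **run_params)
```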

I hope that lets you determine whether Assistants is still an acceptable way for you to interact with OpenAI models.