I've noticed that since prompt caching was introduced, even after a function is removed from the tools list, the system still seems to treat it as available. Would it be possible to apply hashing to the tools section as well? In an ongoing conversation with the same context, it can become necessary to remove a specific function.
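For reference, here is a minimal sketch of the situation I mean, using the Chat Completions API with two calls that share the same long system prompt but differ in their tools list. The model name, tool definitions, and prompt are placeholders, and my expectation about cache behavior is an assumption, not documented behavior:

```python
# Sketch (placeholder model and tools): two calls sharing a long prefix,
# where the second call removes one function from the tools list.
# Assumption: the tools array is part of the cached prefix, so removing
# "get_weather" should change the prefix and miss the cache rather than
# reuse a cached version that still contains the removed function.
from openai import OpenAI

client = OpenAI()

# Long shared context so the prefix is large enough to be cached.
SYSTEM = "You are a helpful assistant. " + "(shared context here) " * 200

get_weather = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

get_time = {
    "type": "function",
    "function": {
        "name": "get_time",
        "description": "Get the current time for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}


def ask(tools, question):
    resp = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": question},
        ],
        tools=tools,
    )
    # usage.prompt_tokens_details.cached_tokens shows how much of the prefix was cached.
    details = resp.usage.prompt_tokens_details
    print("cached prompt tokens:", getattr(details, "cached_tokens", 0))
    return resp


# First call: both functions available (warms the cache for this prefix).
ask([get_weather, get_time], "What's the weather in Rome?")

# Second call: get_weather removed. My question is whether the cache correctly
# drops the removed function, or whether hashing the tools section is needed.
ask([get_time], "What time is it in Rome?")
```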