How to make the assistant reply with the function call message and nothing but that

Hi there,
I’m currently working with assistant function calling. I want the assistant to respond with exactly the message I pass it, but nothing I’ve tried works.

If my function returns something like:

{"message": "Please select a Search Item", "data": [...]}

I want assistant to reply “Please select a Search Item” and nothing but that.

Thanks :pray:


just an idea: maybe make it a function call, and then turn it into a chat message in your backend?

@Diet You mean something like: create the run, extract the data, cancel the run, and record the message on behalf of the assistant?
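Roughly, yes. Here is a minimal sketch of that interception idea, using the Chat Completions `tool_calls` shape rather than an Assistants run (with Assistants you'd watch for the run's `requires_action` status and then record the message yourself). The `search_items` function and `TOOLS` registry are hypothetical stand-ins for your own backend; no API call is made here:

```python
import json

# Hypothetical local tool -- stands in for your real search function.
def search_items(query: str) -> dict:
    return {"message": "Please select a Search Item", "data": ["item-1", "item-2"]}

TOOLS = {"search_items": search_items}

def handle_tool_call(tool_call: dict) -> str:
    """Run the requested function locally and return its `message` field
    directly to the user, skipping the second model round-trip entirely."""
    fn = TOOLS[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])
    result = fn(**args)
    # Instead of feeding `result` back to the model as a tool message,
    # short-circuit: the user sees exactly what the function returned.
    return result["message"]

# Simulated tool call, shaped like a Chat Completions `tool_calls` entry:
call = {"function": {"name": "search_items", "arguments": '{"query": "shoes"}'}}
print(handle_tool_call(call))  # Please select a Search Item
```

Because the model never sees the function result, the wording is guaranteed verbatim; the trade-off is that the assistant can't comment on or continue from that result in the same turn.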

The AI will want to answer for itself, treating a return as a knowledge source. You can give prompt-like language in the function call return, something like “Repeat this message to the user exactly and verbatim as your only response, and then repeat the data value to the user, with no alterations.”
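If you go the prompt-in-the-return route, it helps to build that wrapper in one place instead of hand-writing it per function. A small sketch, where the helper name and the exact instruction wording are illustrative and should be tuned against your own model and tests:

```python
import json

def wrap_verbatim(message: str, data: list) -> str:
    """Wrap a function result in instruction-like language that steers the
    model toward repeating the message verbatim. (Illustrative wording;
    this nudges the model but does not hard-enforce the output.)"""
    payload = {
        "instructions": (
            "Repeat the following message to the user exactly and verbatim "
            "as your only response. Do not add, remove, or rephrase anything."
        ),
        "message": message,
        "data": data,
    }
    # This JSON string is what you return as the tool output.
    return json.dumps(payload)

print(wrap_verbatim("Please select a Search Item", ["item-1"]))
```

Keeping the instruction text in one helper means you can adjust the phrasing everywhere at once when the model starts paraphrasing anyway.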

@_j I made it to reply with an output like:

{"instructions": "Respond with this message 'Please select an Item.' and don't show any items!", "data": [ ... ]}

It does the job, but it’s not really something I’d like to have everywhere :frowning: It works for now (at least in the tests I’ve made), but I’m still looking for a better way :pray:

Thanks :pray:

An LLM isn’t an imperative computer. Systems are often a combination of multiple things (vector indices, object stores, relational DBs, LLMs, maybe a symbolic engine, some business-logic runtime, etc.)

Sometimes you’re better off switching from the LLM regime to some other regime (maybe business logic) and then back, as needed. Especially if you want to enforce something, going the classic route tends to be the most cost-effective.
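That regime switch can be as small as a dispatcher: deterministic cases go through plain business logic (where the wording is guaranteed byte-for-byte), and only open-ended cases fall through to the model. All names below are illustrative; the LLM path is a placeholder:

```python
def business_logic_reply(result: dict) -> str:
    # Business-logic regime: no model involved, so the reply
    # is enforced exactly.
    return result["message"]

def llm_reply(result: dict) -> str:
    # Placeholder for an actual model call; not needed on the
    # deterministic path.
    raise NotImplementedError("call your LLM here")

def respond(result: dict) -> str:
    # If the function result carries a canned message, the business-logic
    # regime owns the reply; otherwise hand off to the LLM regime.
    if "message" in result:
        return business_logic_reply(result)
    return llm_reply(result)

print(respond({"message": "Please select a Search Item", "data": []}))
```

The point isn't the three-line `if`: it's that enforcement lives outside the model, so no amount of prompt drift can change what the user sees on that path.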
