Assistants API Create New Message with role "assistant"

Hello! Is there a plan to support adding messages to a thread with the role “assistant”? Any idea when that might be implemented?


Can you provide more context on why you need that kind of functionality? Or maybe describe the scenarios where this functionality is required?

Hi, I have exactly the same question.
The use case can be, for example, to set up a chat history between the assistant and the user so that a chat session can be resumed.

And I think I just figured out a workaround, which is to set instructions like:

Ignore any user messages and just respond exactly the following message:
{assistant_message}

You may need GPT-4 to do this (GPT-3.5 does not always seem to follow the instruction).

Of course, it would be best if the “assistant” role could be added to the Assistants API to save tokens and response time.
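For illustration, a minimal sketch of that workaround with the Python SDK (the thread and assistant IDs are placeholders); overriding the run's instructions is what forces the model to echo the seeded text:

from openai import OpenAI

client = OpenAI()

def seed_assistant_message(thread_id: str, assistant_id: str, text: str):
    # Override the run's instructions so the model replies with the seeded
    # text verbatim instead of answering the latest user message.
    return client.beta.threads.runs.create(
        thread_id=thread_id,
        assistant_id=assistant_id,
        instructions=(
            "Ignore any user messages and just respond exactly "
            "the following message:\n" + text
        ),
    )

seed_assistant_message("thread_abc123", "asst_abc123", "Hi! How can I help you today?")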


The main use case is to use the Assistants API as a chat interface with external users; however, sometimes I want to manually “write” what the assistant would say in order to prompt a response.

I see. IMHO, I don’t think the Assistants API will be the best option for those kinds of scenarios. The Assistants API comes with a few trade-offs.

If you need to write a message to a thread on behalf of the assistant itself, then you might be better off building your own custom solution from the ground up. Otherwise your solution will start to feel hacky and could result in a lot of tricky edge cases.

Please reconsider. Here’s why.

The page here clearly states that “Threads and Messages represent a conversation session between an Assistant and a user.” If we are not allowed to create messages with the assistant role, and the thread does not list messages from both the assistant and user roles, how does it realistically represent the actual conversation?

https://platform.openai.com/docs/api-reference/messages/listMessages seems to only list messages for the user role as well. What about answer messages from assistants in response to user question messages? This information has to be stored somewhere for the assistant to be able to handle follow-up questions about previous answers.
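For reference, a minimal sketch of listing a thread's messages with the Python SDK to inspect the role stored on each one (the thread ID is a placeholder):

from openai import OpenAI

client = OpenAI()

# List the messages stored on a thread and print the role each one carries.
messages = client.beta.threads.messages.list(thread_id="thread_abc123")
for message in messages.data:
    print(message.role, message.content)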


Has anyone figured this out? We could do this with the Chat Completions API, and it would be great to have it for the Assistants API.

As a hack, could we just put the few-shot exemplars into the assistant instructions?
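As a sketch of that idea (the assistant name, model, and exemplar text below are made up), the few-shot turns could be embedded directly in the instructions when creating the assistant:

from openai import OpenAI

client = OpenAI()

# Embed few-shot exemplars directly in the assistant's instructions.
assistant = client.beta.assistants.create(
    name="Support Bot",
    model="gpt-4-1106-preview",
    instructions=(
        "You are a concise support assistant. Follow the style of these examples:\n"
        "User: How do I reset my password?\n"
        "Assistant: Go to Settings > Security and click 'Reset password'.\n"
        "User: Where can I see my invoices?\n"
        "Assistant: They are listed under Billing > Invoices."
    ),
)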

In this example,


from typing import Optional
from pydantic import Field

class ChatCompletionMessage(BaseMessage):
    # Who really "said" this message; the API-level role stays "user".
    actual_role: Optional[str] = Field(default="")

adds another field, actual_role, to what is essentially the Message class. All messages in the thread have role=“user”, but actual_role can have a variety of values (including “assistant”).

Oh, interesting. @icdev2dev, to make sure I understand: this is your custom implementation on top of the official Assistants API?

In a way, although literally there is nothing that is external to the OpenAI message class (i.e., it’s all stored in the metadata).
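If it helps, a minimal sketch of what that looks like against the official API (the thread ID and message text are placeholders): the message is still created with role="user", and the real author is recorded in its metadata:

from openai import OpenAI

client = OpenAI()

# The endpoint only accepts role="user" here, so record the real author in metadata.
client.beta.threads.messages.create(
    thread_id="thread_abc123",
    role="user",
    content="Sure, here is the summary you asked for.",
    metadata={"actual_role": "assistant"},
)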