Connecting to an existing assistant

@analiticfindchia - Welcome to the community.

For a detailed guide by OpenAI, click here.

STEP 1:

Once the assistant is created, grab its id:

assistant = client.beta.assistants.create(
    name=name,
    instructions=instructions,
    model="gpt-3.5-turbo",
    temperature=temperature,
    top_p=top_p,
    tools=tools,
)
print(assistant.id)  # Prints the id of the assistant just created.
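Since the question is about connecting to an assistant that already exists, note that you don't have to create it again: persist the id and retrieve the same assistant later. The SDK call is shown in a comment (openai Python SDK v1.x); the id values below are made up, and the small prefix check is just an illustrative sanity helper, not part of the SDK.

```python
# Reconnecting later with the openai SDK v1.x -- retrieve the
# assistant by its saved id instead of creating a new one:
#
#   assistant = client.beta.assistants.retrieve(assistant_id)
#
# Assistant ids carry the "asst_" prefix, so a quick check can catch
# a thread or run id passed in by mistake:
def looks_like_assistant_id(value: str) -> bool:
    return value.startswith("asst_")

print(looks_like_assistant_id("asst_demo123"))    # hypothetical id
print(looks_like_assistant_id("thread_demo123"))  # wrong kind of id
```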

STEP 2:
Create a new thread

thread = client.beta.threads.create()

STEP 3:
Add the user's message to the thread:

message = client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="hi!"
)  # change content to the message you receive from the API

STEP 4:
The messages on the thread now need to be run against the model for it to generate output. You can use streaming for a typewriter effect, or run without streaming. Here is sample code for the non-streaming case.
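For the streaming variant, here is a sketch assuming the openai Python SDK v1.x: you subclass `AssistantEventHandler` and print each text delta as it arrives (the SDK wiring is shown in comments; the `typewriter` helper below just simulates how the deltas concatenate, without the network).

```python
# Streaming "typewriter" sketch, assuming openai SDK v1.x:
#
#   from openai import OpenAI, AssistantEventHandler
#
#   class Typewriter(AssistantEventHandler):
#       def on_text_delta(self, delta, snapshot):
#           print(delta.value, end="", flush=True)
#
#   with client.beta.threads.runs.stream(
#       thread_id=thread.id,
#       assistant_id=assistant.id,
#       event_handler=Typewriter(),
#   ) as stream:
#       stream.until_done()
#
# The effect is just incremental concatenation of text deltas,
# simulated here locally:
def typewriter(deltas):
    reply = ""
    for chunk in deltas:
        reply += chunk  # in the real handler: print(chunk, end="")
    return reply

print(typewriter(["Hel", "lo", "!"]))
```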

Note: handle all the different run statuses; I only handle two here for demonstration. You can find the run status descriptions here.
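One way to cover the other statuses is a small classifier like this (status strings taken from the Assistants API run-status docs; the three buckets and their labels are my own choice for illustration):

```python
# Terminal run statuses per the Assistants API docs; "requires_action"
# needs tool outputs, everything else is either done or still running.
TERMINAL_STATUSES = {"completed", "failed", "cancelled", "expired", "incomplete"}

def classify_run_status(status: str) -> str:
    if status == "requires_action":
        return "submit tool outputs"
    if status in TERMINAL_STATUSES:
        return "terminal"
    return "in progress"  # queued, in_progress, cancelling

print(classify_run_status("requires_action"))
print(classify_run_status("failed"))
print(classify_run_status("in_progress"))
```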

import json

run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id, assistant_id=assistant.id
)
tool_outputs = []

# If the run requires action, execute the requested tools
if run.status == "requires_action":
    # Iterate through the tool calls and execute them
    for tool in run.required_action.submit_tool_outputs.tool_calls:
        # Parse the arguments for this specific tool call
        args = json.loads(tool.function.arguments)
        # If the tool has a name, execute it
        if tool.function.name:
            print("Invoking tool:", tool.function.name)
            function_name = tool.function.name
            # Call your tool here with `args` and save its result in `op`
            op = ...  # placeholder for your tool's result
            # Save the tool output
            tool_outputs.append(
                {
                    "tool_call_id": tool.id,
                    "output": json.dumps(op),
                }
            )

    # If there are tool outputs, submit them
    if tool_outputs:
        try:
            # Submit the tool outputs and poll until the run finishes
            run = client.beta.threads.runs.submit_tool_outputs_and_poll(
                thread_id=thread.id,
                run_id=run.id,
                tool_outputs=tool_outputs,
            )
        except Exception as e:
            print("Failed to submit tool outputs:", e)
    else:
        print("No tool outputs to submit.")

# If the run is completed, read the latest message
if run.status == "completed":
    # Get the messages from the thread (newest first)
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(messages.data[0].content[0].text.value)  # return this from your API handler
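The tool-call placeholder in the code above could be filled with a name-to-function dispatch table like the one below. The `get_weather` tool and its arguments are made up for illustration; the only real constraint, per the API, is that each entry in `tool_outputs` carries the `tool_call_id` and a string `output`.

```python
import json

# Hypothetical local tools, keyed by the function name the model calls.
TOOLS = {
    "get_weather": lambda args: {"city": args["city"], "temp_c": 21},
}

def execute_tool(function_name: str, raw_arguments: str) -> str:
    """Parse the JSON arguments, run the matching tool, and return the
    JSON string to put in tool_outputs[...]["output"]."""
    args = json.loads(raw_arguments)
    return json.dumps(TOOLS[function_name](args))

print(execute_tool("get_weather", '{"city": "Paris"}'))
```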

STEP 5:

Pass this message back in your API response as the assistant message.
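A minimal sketch of that last step, assuming your API returns JSON and you wrap the text from step 4 in a chat-style message (the shape is a common convention, not mandated by the API):

```python
import json

def to_assistant_message(text: str) -> str:
    """Wrap the assistant's reply text as a JSON assistant message."""
    return json.dumps({"role": "assistant", "content": text})

print(to_assistant_message("Hello!"))
```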

Cheers :smiley:!
