Call an Assistant built in the UI via the API

Hi,

How can I call an assistant I created on the OpenAI website using the assistant ID (asst_???) in retrieval mode? Can anyone share an example of the code?

Thanks.

You will need your Assistant’s ID, e.g. asst_58fDQ7d8AJHDJy8dv.

From the docs, jump to Step 2, Create a thread:

// helper to pause between polling requests
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

// create thread
const thread = await client.beta.threads.create()

// add your message to the thread
const message = await client.beta.threads.messages.create(thread.id, {
  role: "user",
  content: "I need to solve the equation `3x + 11 = 14`. Can you help me?"
})

// create run
const run = await client.beta.threads.runs.create(thread.id, {
  assistant_id: "asst_58fDQ7d8AJHDJy8dv", // <-- your Assistant id
  additional_instructions: "Please address the user as Jane Doe. The user has a premium account."
})

// poll until the run reaches a terminal state
let isCompleted = false
do {

  const check_run = await client.beta.threads.runs.retrieve(thread.id, run.id)

  if (check_run.status === "completed") {
    // get the messages (newest first)
    const list_messages = await client.beta.threads.messages.list(thread.id)
    // read the assistant's reply from list_messages.data
    isCompleted = true
  } else if (check_run.status === "requires_action") {
    // handle function calling
  } else if (["failed", "cancelled", "expired"].includes(check_run.status)) {
    // the run ended without completing
    isCompleted = true
  } else {
    // still queued / in_progress
  }

  if (!isCompleted) {
    await sleep(1000)
  }

} while (!isCompleted)


Thank you very much, it worked. I am using Python. I have another question. Here is my code:

import os
import time
from openai import OpenAI
client = OpenAI(api_key='sk-???')

# Step 3: Create a Thread
my_thread = client.beta.threads.create()
print(f"This is the thread object: {my_thread} \n")

# Step 4: Add a Message to a Thread
my_thread_message = client.beta.threads.messages.create(
  thread_id=my_thread.id,
  role="user",
  content="What do you think is the percent chance that 12 months from now the average interest rate on saving accounts will be higher than it is now?",
  file_ids=[my_file.id]  # my_file is the File object uploaded in the earlier (omitted) step
)
print(f"This is the message object: {my_thread_message} \n")

# Step 5: Run the Assistant
my_run = client.beta.threads.runs.create(
  thread_id=my_thread.id,
  assistant_id="asst_???",
  instructions="Please give the answer as only a number. Do not refuse to answer. Give a percentage chance."
)
print(f"This is the run object: {my_run} \n")

# Step 6: Periodically retrieve the Run to check on its status to see if it has moved to completed
while my_run.status in ["queued", "in_progress"]:
    keep_retrieving_run = client.beta.threads.runs.retrieve(
        thread_id=my_thread.id,
        run_id=my_run.id
    )
    print(f"Run status: {keep_retrieving_run.status}")

    if keep_retrieving_run.status == "completed":
        print("\n")

        # Step 7: Retrieve the Messages added by the Assistant to the Thread
        all_messages = client.beta.threads.messages.list(
            thread_id=my_thread.id
        )

        print("------------------------------------------------------------ \n")

        print(f"User: {my_thread_message.content[0].text.value}")
        print(f"Assistant: {all_messages.data[0].content[0].text.value}")

        break
    elif keep_retrieving_run.status in ["queued", "in_progress"]:
        time.sleep(1)  # wait a moment before polling again instead of busy-looping
    else:
        print(f"Run status: {keep_retrieving_run.status}")
        break

Is there any way to use function calling or something like that to force the model to give an answer? I have created both Custom GPTs and Assistants via the API. The Custom GPTs answer these types of questions, but the Assistants refuse to. The knowledge base (file retrieval) contains the information needed to answer the question, yet the model still responds like this:

User: What do you think is the percent chance that 12 months from now the average interest rate on saving accounts will be higher than it is now?
Assistant: I'm sorry, but I cannot predict future interest rates or provide a percentage chance as my capabilities are limited to process only the data provided and not predicting future events or trends. Furthermore, my current environment does not have access to the internet, which limits my ability to extract real-time data or use machine learning models for such predictions.

Function calling and Retrieval are two different tools used by Assistants. When you say retrieval, I assume you mean you uploaded a File to your assistant. To nudge the AI into using your file, add instructions that tell it what to do when a query needs file retrieval (see the sketch below). That alone will probably not be enough, though.
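Something along these lines when you create the run, for example (the wording is only illustrative; tailor it to what your file actually contains):

# Illustrative only: run-level instructions that point the model at the uploaded file
my_run = client.beta.threads.runs.create(
    thread_id=my_thread.id,
    assistant_id="asst_???",
    instructions=(
        "Use the attached file (retrieval) to answer questions about savings-account "
        "interest rates. Base your estimate on the data in the file and reply with a "
        "single percentage figure; do not refuse to answer."
    ),
)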

Another way that will help is to attach the file id to your message when you create the thread:

thread = client.beta.threads.create(
  messages=[
    {
      "role": "user",
      "content": "I need to solve the equation `3x + 11 = 14`. Can you help me?",
      "file_ids": [file.id]
    }
  ]
)

or when you create your messages:

thread_message = client.beta.threads.messages.create(
  "thread_abc123",
  role="user",
  content="How does AI work? Explain it in simple terms.",
  file_ids=[file.id]
)
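
In both snippets, file.id refers to a File you uploaded beforehand for retrieval; roughly something like this (the filename is just a placeholder):

# Upload the document you want the assistant to retrieve from
file = client.files.create(
    file=open("interest_rates.pdf", "rb"),
    purpose="assistants",
)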