Does thread creation or deletion consume tokens?

Hello!

I’m developing a mediator API for my company. Since we can retrieve all messages by requesting the message list, my plan is to create a thread together with the user’s message, start a run, wait until the run completes, and then fetch the messages. After that I won’t need the thread anymore, so I plan to delete it.
In this flow, which operations consume tokens? Is it just processing the message, or do creating and deleting the thread also consume tokens?
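
Roughly, this is the flow I have in mind (just a sketch using the SDK’s beta Assistants endpoints; ASSISTANT_ID and the question text are placeholders, and the combined create-and-run call is simply how I read “create thread and run together with the message”):

    import time
    import openai

    ASSISTANT_ID = "asst_..."  # placeholder

    # Create the thread with the first user message and start a run in one call
    run = openai.beta.threads.create_and_run(
        assistant_id=ASSISTANT_ID,
        thread={"messages": [{"role": "user", "content": "My question"}]},
    )

    # Wait for the run to finish (simplified; failed/cancelled/expired would
    # also need handling in real code)
    while run.status not in ("completed", "failed", "cancelled", "expired"):
        time.sleep(1)
        run = openai.beta.threads.runs.retrieve(thread_id=run.thread_id, run_id=run.id)

    # Read the assistant's reply from the message list, then delete the thread
    messages = openai.beta.threads.messages.list(thread_id=run.thread_id)
    openai.beta.threads.delete(thread_id=run.thread_id)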

Thanks in advance

My understanding is as follows.

I use the code below to create a message, and then…

        # Add the user's question as a message on the (already created) thread
        messages = openai.beta.threads.messages.create(
            thread_id=thread.id,
            content=question,
            role="user"
        )

…run it as below. This is where I submit the request by starting a run on the thread. The run then goes through multiple stages (queued, in_progress, completed, and so on)…

        # Start a run of the assistant on the thread, then wait for the answer
        run = openai.beta.threads.runs.create(
            thread_id=thread.id,
            assistant_id=assistant_id
        )
        _, answer = self.get_answer(run, thread, type="")

…which I track using the loop below. Once the run reaches the status I want, the loop exits…

        # Poll the run until its status becomes "completed"
        while run.status != "completed":
            run = openai.beta.threads.runs.retrieve(
                thread_id=thread.id,
                run_id=run.id
            )
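
As written, that loop polls in a tight cycle and never exits if the run ends in any state other than completed. A slightly more defensive version could look like the sketch below (the extra statuses come from the run lifecycle; wait_for_run and poll_interval are names I made up):

    import time
    import openai

    TERMINAL_STATUSES = {"completed", "failed", "cancelled", "expired"}

    def wait_for_run(thread_id, run_id, poll_interval=1.0):
        # Re-fetch the run until it reaches a terminal state, pausing between polls
        while True:
            run = openai.beta.threads.runs.retrieve(thread_id=thread_id, run_id=run_id)
            if run.status in TERMINAL_STATUSES:
                return run
            time.sleep(poll_interval)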

… and delete the thread

        # Delete the thread once the answer has been read; the response object
        # indicates whether the deletion succeeded
        status = openai.beta.threads.delete(thread_id=thread.id)
        if status.deleted:
            print("Deleting thread id: ", status.id)
            return True
As I understand it, tokens are only consumed while the run is actually being processed by the model (the in_progress stage), when the thread’s messages and the assistant’s instructions are sent as the prompt and the reply is generated. Tokens are not consumed each time I poll the run’s status.
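
One way to check this directly is the usage field on the run object: once a run reaches a terminal state, it reports the prompt and completion tokens billed for that run (a sketch, reusing the run and thread objects from the snippets above):

        # After the run has completed, the run object itself reports token usage;
        # polling the status or deleting the thread does not add to these numbers
        run = openai.beta.threads.runs.retrieve(
            thread_id=thread.id,
            run_id=run.id
        )
        if run.usage is not None:
            print("prompt tokens:    ", run.usage.prompt_tokens)
            print("completion tokens:", run.usage.completion_tokens)
            print("total tokens:     ", run.usage.total_tokens)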

I look forward to others commenting on and correcting my understanding.

Thanks for the great answer. I have a similar understanding but need to be sure about it. As I understand it, only starting a run consumes tokens, even if we never request the answer afterwards. If anyone has better or more certain information, please share it with us.