ToolAssistantToolsFunction and Tool no longer defined?

What should I now be doing instead of

from openai.types.beta.threads.run_create_params import ToolAssistantToolsFunction, Tool

See,

@elmstedt I saw that but wasn’t clear what to do about it. What’s the correct import?

This is what is provided by openai.types.beta.threads.runs,

from openai.types.beta.threads.runs import (
    CodeInterpreterLogs,
    CodeInterpreterOutputImage,
    CodeInterpreterToolCall,
    CodeInterpreterToolCallDelta,
    FunctionToolCall,
    FunctionToolCallDelta,
    MessageCreationStepDetails,
    RetrievalToolCall,
    RetrievalToolCallDelta,
    RunStep,
    RunStepDelta,
    RunStepDeltaEvent,
    RunStepDeltaMessageDelta,
    ToolCall,
    ToolCallDelta,
    ToolCallDeltaObject,
    ToolCallsStepDetails,
)
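(If you want to check for yourself where a name has moved to, or whether it still exists anywhere in the installed package, you can walk the package and look for it. A stdlib-only sketch - demonstrated on json here, but the same call works on openai:)

```python
import importlib
import pkgutil


def find_symbol(package_name: str, symbol: str) -> list:
    """Return the dotted module paths inside a package that define `symbol`."""
    package = importlib.import_module(package_name)
    hits = [package_name] if hasattr(package, symbol) else []
    for info in pkgutil.walk_packages(
        getattr(package, "__path__", []), package_name + "."
    ):
        try:
            module = importlib.import_module(info.name)
        except Exception:
            continue  # skip submodules that fail to import
        if hasattr(module, symbol):
            hits.append(info.name)
    return hits


# Same idea for the question at hand:
#   find_symbol("openai", "ToolAssistantToolsFunction")
# An empty list means the class is gone from your installed version.
print(find_symbol("json", "JSONDecoder"))
```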

@elmstedt Sorry to be dim but where did ToolAssistantToolsFunction and Tool go? What has replaced them if they’re gone?

I’ll admit, I’m not the best person to ask—I’ve only toyed with the assistants endpoint, I do my own RAG with chat/completions.

I’m sure you’ve already seen these, but I recommend you look at the code samples in the API reference,

https://platform.openai.com/docs/api-reference/assistants

and the quick-start guide,

https://platform.openai.com/docs/assistants/overview

again.

Unfortunately the assistants playground doesn’t have a “view code” feature like chat/completions does, so you can’t build it up with a GUI and extract the equivalent code.

As I said, I’m not the best person to ask about assistants as I only played with it briefly when it was first announced.

If @_j is around, of the people who are regularly here, they’re probably the most knowledgeable and experienced person with respect to the assistants endpoint and they’re also among the most helpful by far.

I’ve tagged them in the hopes they can drop in and quickly answer your questions.

In the meantime, if you are willing to post some more complete code of the instantiation of your assistant, I (or more likely someone else) might be able to see better exactly what you need to change to get up and running again.
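One workaround that may sidestep the missing typed imports entirely: the request params in the SDK are TypedDicts under the hood, so anywhere you used to pass a Tool or ToolAssistantToolsFunction object you should be able to pass a plain dict of the same shape instead. A sketch (the function name and schema below are just illustrative):

```python
# A function tool definition as a plain dict, matching the shape the
# typed aliases described. "get_weather" is an illustrative example.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Then pass it wherever the typed object went before, e.g.:
# run = client.beta.threads.runs.create(
#     thread_id=thread.id,
#     assistant_id=assistant.id,
#     tools=[get_weather_tool],
# )
```

No imports needed, and a plain dict tends to survive SDK reshuffles, since the wire format is what actually matters.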

It looks like OpenAI has done a complete scrambling of the hierarchy of the Python library. There is no more beta subdirectory or clean hierarchy of endpoints. 4,454 additions and 488 deletions - something only the magic of an automated tool could bring.

And then to have them still reachable as “beta” through proxies:


class ChatProxy(LazyProxy[resources.Chat]):
    @override
    def __load__(self) -> resources.Chat:
        return _load_client().chat


class BetaProxy(LazyProxy[resources.Beta]):
    @override
    def __load__(self) -> resources.Beta:
        return _load_client().beta

...
chat: resources.Chat = ChatProxy().__as_proxied__()
beta: resources.Beta = BetaProxy().__as_proxied__()

If you don’t want to rewrite everything for assistant/tool streaming and figure out what’s been done, you can pip install openai==1.13.4 to stay on the last version before the change.

The readme from four days ago refers to this batch of assistants changes as “streaming helpers” - which you’ll now find documented in the readme only after figuring out for yourself what’s new.
Methods like “client.beta.threads.runs.create_and_stream”

So it will take some digging in later to put the new code into the old brain.

I don’t know why you’d want to individually import a dozen names like that, unless you’re also renaming them for subclassing. A whole bunch of imports - and then you still have to import the whole library anyway to get the error-code handling.

Assistants is also a big “don’t care” from me.
