Assistant that gathers information and stores or processes it for some pre-defined documents


I need to create a solution that will gather some information from the user based on the conversation and then use a function to generate an output (like a pre-filled survey or some kind of schema).

I initially tried with some Azure solutions like LUIS and QnA Maker, but it didn’t work out.

At the moment my solution is pretty simple; it is based on a similar topic - create-question-asking-chatbot-that-collects-user-information/657588 (as a new user I can’t paste full links). It gathers information, calculates an insurance fare, and generates a JSON file that contains all the necessary data.

Sample conversation:

And output:

    "location": {
        "city": "Boston",
        "state": "Massachusetts"
    "plan_details": {
        "plan_type": "family",
        "total_people": 3,
        "age_details": [
                "person_id": 1,
                "age": 12
                "person_id": 2,
                "age": 34
                "person_id": 3,
                "age": 45
    "pricing": {
        "base_price": 50,
        "additional_person_price": 30,
        "total_price": 110

I did run into some issues while writing the code: I started with the simple Assistants examples but had to move to the streaming solution; I based it on platform.openai. com/docs/assistants/overview?context=with-streaming.

Obviously, there are still some issues: sometimes the assistant ‘forgets’ to ask for something and hallucinates, so I will need to optimize the instructions.
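One cheap guardrail against the ‘forgets to ask’ failure mode, besides tuning the instructions, is to validate the parsed arguments before running the function and return an error the model can react to. A minimal sketch (the field names follow the `getplanprice` tool schema in my code; the max of 5 people comes from the assistant instructions):

```python
def validate_plan_arguments(parameters):
    """Return a list of problems with the model-supplied arguments (empty list = OK)."""
    problems = []

    # The city and state must both be present.
    location = parameters.get("location") or {}
    for field in ("city", "state"):
        if not location.get(field):
            problems.append(f"missing location.{field}")

    # The plan type and headcount must be sane (max 5 per the instructions).
    plan_details = parameters.get("plan_details") or {}
    people = plan_details.get("people")
    if str(plan_details.get("plan", "")).lower() not in ("individual", "family"):
        problems.append("plan must be Individual or Family")
    if not isinstance(people, int) or not 1 <= people <= 5:
        problems.append("people must be an integer between 1 and 5")

    # One age entry is needed per person on the plan.
    ages = parameters.get("age_details") or []
    if isinstance(people, int) and len(ages) != people:
        problems.append("one age entry is needed per person")

    return problems

# In handle_requires_action, the tool output could then become something like:
#   problems = validate_plan_arguments(parameters)
#   result = {"error": "; ".join(problems)} if problems else self.call_getplanprice(parameters)
```

Returning `{"error": ...}` as the tool output gives the model a chance to ask the user for the missing details instead of inventing them.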

Anyway - as I am new to the Assistants topic, I would like to share my code and ask whether more experienced users have any suggestions on how to improve it, or whether there are any bad practices / problematic elements.

I also hope that it might be helpful for some people as the Assistants feature seems to have a great potential.

I use Azure OpenAI service, but I’m pretty sure it would work exactly the same with OpenAI.

import os
import json
from openai import AzureOpenAI, AssistantEventHandler

api_key = os.getenv("AZURE_OPENAI_API_KEY")
azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
api_version = "2024-02-15-preview"

client = AzureOpenAI(
    api_key=api_key,
    azure_endpoint=azure_endpoint,
    api_version=api_version
)

tools = [
    {"type": "code_interpreter"},
    {
        "type": "function",
        "function": {
            "name": "getplanprice",
            "description": "Needs the city/state, the type of plan, the number of people on the plan, and the ages of the people",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "object",
                        "properties": {
                            "city": {"type": "string"},
                            "state": {"type": "string"}
                        },
                        "required": ["city", "state"]
                    },
                    "plan_details": {
                        "type": "object",
                        "properties": {
                            "plan": {"type": "string", "description": "Individual or Family"},
                            "people": {"type": "integer", "description": "The number of people on the plan"}
                        },
                        "required": ["plan", "people"]
                    },
                    "age_details": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "person_id": {"type": "integer"},
                                "age": {"type": "integer"}
                            }
                        },
                        "description": "List of ages of all plan participants"
                    }
                },
                "required": ["location", "plan_details", "age_details"]
            }
        }
    }
]

def getplanprice(location, plan_details, age_details):
    city = location['city']
    state = location['state']
    plan = plan_details['plan']
    people = plan_details['people']
    ages = [age_detail['age'] for age_detail in age_details]

    # Example pricing logic
    base_price = 50
    additional_person_price = 30
    total_price = base_price + (additional_person_price * (people - 1))

    plan_summary = {
        "location": {
            "city": city,
            "state": state
        },
        "plan_details": {
            "plan_type": plan,
            "total_people": people,
            "age_details": age_details
        },
        "pricing": {
            "base_price": base_price,
            "additional_person_price": additional_person_price,
            "total_price": total_price
        }
    }

    return plan_summary

class EventHandler(AssistantEventHandler):
    def __init__(self, thread):
        super().__init__()  # required so the SDK can initialize its internal stream state
        self.result = None
        self.thread = thread

    def on_event(self, event):
        # A function call pauses the run until we submit the tool outputs
        if event.event == 'thread.run.requires_action':
            run_id = event.data.id
            self.handle_requires_action(event.data, run_id)

    def handle_requires_action(self, data, run_id):
        tool_outputs = []
        for tool in data.required_action.submit_tool_outputs.tool_calls:
            if == "getplanprice":
                parameters = json.loads(tool.function.arguments)
                self.result = self.call_getplanprice(parameters)
                tool_outputs.append({"tool_call_id":, "output": json.dumps(self.result)})
        self.submit_tool_outputs(tool_outputs, run_id)

    def call_getplanprice(self, parameters):
        try:
            location = parameters['location']
            plan_details = parameters['plan_details']
            age_details = parameters['age_details']

            # Call the actual getplanprice function with parsed parameters
            return getplanprice(location, plan_details, age_details)
        except Exception as e:
            print(f"Error in getplanprice: {e}")
            return {"error": str(e)}

    def submit_tool_outputs(self, tool_outputs, run_id):
        try:
            with client.beta.threads.runs.submit_tool_outputs_stream(
                thread_id=self.thread.id,
                run_id=run_id,
                tool_outputs=tool_outputs,
                event_handler=EventHandler(self.thread)  # New event handler instance
            ) as stream:
                for text in stream.text_deltas:
                    pass  # Drain the stream; text deltas could be printed here if needed
        except Exception as e:
            print(f"Failed to submit tool outputs: {e}")

def save_response_to_file(response):
    try:
        with open('response.json', 'w') as file:
            json.dump(response, file, indent=4)
        print("💾 Response saved to 'response.json'")
    except Exception as e:
        print(f"Failed to save response: {e}")

def create_assistant():
    try:
        return client.beta.assistants.create(
            model="gpt-35-turbo",  # use your Azure OpenAI deployment name here
            tools=tools,
            instructions="""You are a helpful telecom sales assistant. 
            In order to help someone find the right plan, you need the city a person is based in and whether the person is looking for an individual plan or a family plan. 
            For an individual plan, just ask the age of the person.
            If it is a family plan, we need to know how many people are on it and what their ages are for each person. The max for a family plan is set to 5. In order to provide a quote, you will need to have the ages of all people on the plan.
            When people provide only the city name, please infer the state and then confirm with the person. For example, if they say 'Atlanta', you ask something like 'So you are in Atlanta, Georgia, right?' - do it BEFORE generating the plan.
            Once you have all the information (Individual or family plan, number of people and age for EACH person, city, and state) you can call the plan function that will return the price.

            When you do so - write that the plan is ready, thank the person, and say goodbye."""
        )
    except Exception as e:
        print(f"Failed to create assistant: {e}")
        return None

def main():
    assistant = create_assistant()
    if not assistant:
        return

    thread = client.beta.threads.create()
    print("💬 Assistant: Hello! Tell me what you need.")
    while True:
        user_input = input("➡️ You: ")
        if user_input.lower() == 'exit':
            break
        message = client.beta.threads.messages.create(thread_id=thread.id, role="user", content=user_input)
        event_handler = EventHandler(thread)
        try:
            with client.beta.threads.runs.stream(
                thread_id=thread.id,
                assistant_id=assistant.id,
                event_handler=event_handler
            ) as stream:
                stream.until_done()
        except Exception as e:
            print(f"Failed to process request: {e}")

        messages = client.beta.threads.messages.list(thread_id=thread.id)
        latest_message = messages.data[0]
        print(f"💬 Assistant: {latest_message.content[0].text.value}")

        if event_handler.result:
            save_response_to_file(event_handler.result)

if __name__ == "__main__":
    main()
Oh - one more thing! It seems to work much better with GPT-3.5 Turbo than with the GPT-4 versions. In the latter case the bot needs much more time to process the questions and often tends to reply with my own free-text inputs - I am not sure why this is happening and would also appreciate some help from more experienced users.