Create a question-asking chatbot that collects user information?

Hi all,

I’ve been trying for a month to create a chatbot with the OpenAI API that collects user information by asking the user some questions, so it can find the best option for a certain plan (say, a phone plan or a health plan). It should ask for the:
name,
city,
if it’s an individual or family plan,
how many people in the plan,
the age of the individual, or individuals

It should always try to return to the information-collecting stage if the user deflects or changes the subject;
It has to infer that an individual plan is for one person (and not ask for the number of people in that case);
It has to infer the state from the city name;
If the user gives a number of ages that doesn't match the number of people on the plan, it should ask them to correct that information;
It should try to ask in order, but not mind if the user answers out of order;

It has to sound like a human at all times.

By the end, it should summarize all the information and return it as JSON.
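To make that last point concrete, the kind of JSON I have in mind at the end would look roughly like this (the field names are just an example, not a fixed schema):

```python
# A rough example of the summary object I'd expect the bot to emit at the end.
# Field names and values are purely illustrative.
example_summary = {
    "name": "Jane Doe",
    "city": "Atlanta",
    "state": "GA",           # inferred from the city
    "plan_type": "family",   # "individual" or "family"
    "num_people": 3,         # implied to be 1 for individual plans
    "ages": [21, 41, 12],    # must match num_people
}
```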

I’ve tried it all! For example:
Simple step-by-step prompting;
LangChain with a tagging chain;
dynamically changing the system message with a list of the information still missing and what we've already got (see the sketch after this list);
fine-tuning with dozens of example conversations;
etc.
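For reference, the third approach looked roughly like this (a minimal sketch, not my exact code; the field list, model name and prompt wording are illustrative):

```python
# Sketch of the "dynamic system message" idea: rebuild the system prompt on every
# turn with what has been collected so far and what is still missing.
from openai import OpenAI

client = OpenAI()
REQUIRED_FIELDS = ["name", "city", "plan_type", "num_people", "ages"]

def build_system_message(collected: dict) -> str:
    missing = [f for f in REQUIRED_FIELDS if f not in collected]
    return (
        "You are a friendly sales assistant collecting information for a plan quote.\n"
        f"Already collected: {collected}\n"
        f"Still missing: {missing}\n"
        "Ask for ONE missing item at a time. If the user changes the subject, "
        "answer briefly and steer back to the missing information."
    )

def next_reply(collected: dict, history: list[dict]) -> str:
    messages = [{"role": "system", "content": build_system_message(collected)}] + history
    response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
    return response.choices[0].message.content
```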

Every approach has a weak spot: it forgets to ask some questions; the tagging chain breaks due to a field type error; the fine-tuned model gets too sassy or repetitive; etc.

Has anyone here ever built such a chatbot? Do you have any tutorials or resources for doing so? I found it hard to find anything, except for a YouTube video called HOW to Make Conversational Form with LangChain | LangChain TUTORIAL.

Thanks!

3 Likes

I know this is going to sound convoluted, but have you tried creating a Custom GPT in the My GPTs section that can perform the steps you're asking for (give it a loop to follow to ensure it gets just the info you want and nothing else), then setting up an Assistant to reach out to that Custom GPT, and pointing your API calls at the Assistant?

2 Likes

I’ve not tried that. Is there a tutorial you know about that teaches this?

1 Like

It says “You do not currently have access to this feature”. I’ve read it’s a problem even for Plus members.

This seems to be a perfect case for an Assistant.

Go to platform.openai.com and create an Assistant.
Here’s the prompt I created for you:

You are a helpful telecom sales assistant. In order to guide someone to the right plan, you need the city the person is based in and whether the person is looking for an individual plan or a family plan.

For an individual plan- just ask the age of the person. 

If it is a family plan - we need to know how many people are on it and what their ages are for each person.
The max for a family plan is set to 5. 
In order to provide a quote - you will need to have the ages of all people on the plan.

When people provide only the city name, please infer the state and then confirm with the person. Like when they say 'Atlanta' - you ask something like 'So you are in Atlanta, Georgia, right?'

Once you have all the information (Individual or family plan, number of people and age for EACH person, city and state) you can call the plan function that will return the price. 

And a demo of the function profile:

{
  "name": "getplanprice",
  "parameters": {
    "type": "object",
    "properties": {
      "plan": {
        "type": "string",
        "Description": "Individual or Family"
      },
      "people": {
        "type": "integer",
        "description": "The number of people on the plan"
      },
      "ages": {
        "type": "string",
        "description": "Commma separates list of the ages of all plan participants"
      }
    },
    "required": [
      "plan",
      "people",
      "ages"
    ]
  },
  "description": "Needs the city/state, the type of plan, the number of peole on the plan and the ages of the people"
}

You can then run it in the Sandbox.
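If you'd rather set it up from code instead of the Playground, a minimal sketch with the Python SDK would look something like this (the model name is just an example, and the instructions string is the prompt above, trimmed here for brevity):

```python
# Create the same Assistant programmatically, with the function registered as a tool.
from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    name="test planner",
    model="gpt-4-1106-preview",  # any Assistants-capable model
    instructions="You are a helpful telecom sales assistant. ...",  # full prompt from above
    tools=[{
        "type": "function",
        "function": {
            "name": "getplanprice",
            "description": "Needs the city/state, the type of plan, the number of "
                           "people on the plan and the ages of the people",
            "parameters": {
                "type": "object",
                "properties": {
                    "plan": {"type": "string", "description": "Individual or Family"},
                    "people": {"type": "integer", "description": "The number of people on the plan"},
                    "ages": {"type": "string", "description": "Comma-separated list of the ages of all plan participants"},
                },
                "required": ["plan", "people", "ages"],
            },
        },
    }],
)
print(assistant.id)
```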
It seems to do pretty well - here is a bit of an interaction I was having just now.

User

I am looking for a plan

test planner

Of course! I'd be happy to help you find the right plan. Could you let me know if you're looking for an individual plan or a family plan? And also, which city and state are you based in?

User

What is the difference?

test planner

The main differences between an individual plan and a family plan are as follows:

1. **Individual Plan:** This is designed for a single user. It focuses on meeting the needs of one person and is tailored to their usage patterns, such as call time, data usage, and text messages. It's perfect if you are the only one using the service.
2. **Family Plan:** This is intended for multiple users, typically within the same family, though some plans allow for friends or employees to be included as well. Family plans often provide a better value per user compared to individual plans because the cost of the plan is shared among multiple people. They include shared or individual data allocations, minutes, and texting capabilities for each member. Family plans can cover up to a specified number of people, typically around 5.

Choosing between the two depends on your needs. If you're the sole user, an individual plan might be more suitable. If you're looking to include multiple people, such as family members or a small group, a family plan would likely offer more value and convenience.

Could you let me know which one fits your needs best, and also provide the city and state you are in?

User

Ok - I want a family plan!

test planner

Great choice! To get started on finding the best family plan for you, could you please provide the following details:

1. How many people will be on this family plan? (Please note that the maximum is set to 5.)
2. The ages of each person who will be on the plan.
3. The city and state you're based in.

This information will help me to find the most suitable family plan for you.

User

We are 3 people

test planner

Got it, a family plan for 3 people. Could you please share the ages of each person who will be on the plan? Additionally, I still need to know the city and state you're based in to ensure we find a plan that offers the best coverage and pricing for your area.

User

21 and 41

test planner

It looks like we have the ages for two people so far: 21 and 41. Since you mentioned there are 3 people in the plan, could you please share the age of the third person?

Also, to complete the setup and find the best plan options, could you provide the city and state you're based in? This information is essential to ensure we get the correct coverage options for your location.

2 Likes

I don’t want to be a negative Nancy here, but one small point to add: you also want to be mindful of the privacy obligations that come with processing certain types of personal information, especially as it relates to healthcare.

See here (section on Privacy):

1 Like

The health care thing was just an example. I can’t actually tell what it’s about, but it’s nothing nefarious lol. Also, the chatbot will only contact users who have previously filled out a form consenting to being contacted by the company for some follow-up questions. And it’ll only ask a few basic questions to check whether the client is still interested, then redirect them to a real salesman. Thanks for the concern, though!

1 Like

I’ll try that! I had actually already tried Assistants before, but I had to create some validations that required something a bit more sophisticated, so I migrated to LangChain. Now I realize I could use function calling for that, and I could use your prompt as a model to refine my own.
Thanks

Is it possible to make the assistant ask the user the questions and not the other way around? And can I constantly verify whether the information was gathered, or only after it gets it all?

1 Like

Sounds good. Don’t get me wrong here, just wanted to make sure you don’t face any issues down the road.

1 Like

Yes, you can start with an opening question, of course.

I am not sure why you want to ‘constantly verify’ - that kind of defeats the purpose of having a bot handle it? You could of course add another function that you tell the Assistant to call after each question, but I would not recommend doing that. Better to just get the right prompt and let the Assistant do what it is made to do.

1 Like

I have built something similar, and let me tell you: it will not work as you expect. You’re giving way too much control to the model. You can validate each answer using typical database functions, use a single instance of GPT to phrase each question, and use GPT to compile the answers into something structured.

If you want to gather answers, there’s no practical reason to have it as a chatbot. You can imitate a chatbot, though.
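To illustrate what I mean by imitating a chatbot (a rough sketch, not the interface I'm releasing; the fields, validators and wording are placeholders):

```python
# Plain code owns the question flow and validation; the model only rephrases the
# next question so the exchange still feels conversational.
from openai import OpenAI

client = OpenAI()

QUESTIONS = {
    "plan_type": ("Individual or family plan?", lambda v: v.lower() in ("individual", "family")),
    "num_people": ("How many people on the plan?", lambda v: v.isdigit() and 1 <= int(v) <= 5),
    "city": ("Which city are you in?", lambda v: len(v.strip()) > 0),
}

def ask(field: str) -> str:
    prompt, validate = QUESTIONS[field]
    phrased = client.chat.completions.create(
        model="gpt-3.5-turbo",  # example model name
        messages=[{"role": "user", "content": f"Rephrase this as a friendly question: {prompt}"}],
    ).choices[0].message.content
    while True:
        answer = input(phrased + " ")
        if validate(answer):           # ordinary validation, no LLM involved
            return answer
        phrased = f"Sorry, that doesn't look right. {prompt}"

answers = {field: ask(field) for field in QUESTIONS}
print(answers)  # compile or store however you like, e.g. in Supabase
```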

I’ll be releasing an open source interface for this shortly if you are interested. It hooks up to Supabase.

1 Like

What do you mean by “imitate a chatbot”? Do a regular form that detects the user input, but only asks pre-planned questions?

This is very easy to do - just use function calling (which is just a JSON response).
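Something along these lines, as a minimal sketch (the schema and model name are just examples, not a recommendation):

```python
# Function calling with Chat Completions: the model fills in a JSON object once it
# has gathered everything, and you get a plain dict back to validate in code.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "save_plan_details",
        "description": "Record the collected plan details once all of them are known",
        "parameters": {
            "type": "object",
            "properties": {
                "plan": {"type": "string", "enum": ["individual", "family"]},
                "people": {"type": "integer"},
                "ages": {"type": "array", "items": {"type": "integer"}},
                "city": {"type": "string"},
                "state": {"type": "string"},
            },
            "required": ["plan", "people", "ages", "city", "state"],
        },
    },
}]

messages = [
    {"role": "system", "content": "Collect the plan details, then call save_plan_details."},
    {"role": "user", "content": "Family plan for 3 people, ages 21, 41 and 12, in Atlanta."},
]

response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages, tools=tools)
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    details = json.loads(tool_calls[0].function.arguments)
    print(details)  # plain dict, e.g. {'plan': 'family', 'people': 3, ...}
```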

Your use case is non-trivial for several reasons. It’s also inadvisable if you’re operating in the EU or California, as collecting personal data via a hosted LLM will certainly (if not already right now, then definitely as soon as upcoming legislation lands) place you in hot water.

Some of the issues you will encounter include: data input validation, LLM hallucination (including answering questions on behalf of the user), LLM losing its place in the question steps, the requirement to perform a (not 100% robust) function call to send data back to you or another service for processing, hitting context length caps, hitting usage limits (if using an OpenAI GPT), latency affecting response time, etc.

We solved it by using our own fine-tuned Open Source LLMs, and creating a directed process written with Langchain and an integration automation workflow system to mediate the interaction.

Unlikely you will be able to create a robust service without building a lot of code around the LLM to handle exceptions and edge cases.

Using OpenAI Assistants might help you gain some level of control, but then you’re still at the mercy of what OpenAI will do with any input data - you definitely don’t want personal data leaking in future versions of ChatGPT because they finetuned or RLHF’d a model using user data (their privacy policy is suspiciously ambiguous about what “training” actually refers to).

Best of luck.

2 Likes

I’d first try solving the problem using a non-LLM solution, which probably involves a flow chart with predefined questions/responses.

Once you have that, you can substitute out each individual predefined question/response with an LLM query whose instructions contain the predefined question (what type of plan?) and the allowed responses (single, family, etc.), plus a user query containing the user’s transcript. Have the LLM return a response in JSON or something, then validate.

Then, error rates permitting, you can move to a more LLM-based approach where you give the LLM the entire transcript and prompt it for the answer to each question. Tell it in the instructions to respond NOT FOUND if it can’t find the answer. Whenever it doesn’t find a response, you can ask the user, append their response to the transcript, and retry.
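A rough sketch of that last step (the field list, prompts and model name are placeholders):

```python
# Per-question extraction over the whole transcript, with a NOT FOUND fallback that
# asks the user directly, appends their reply to the transcript, and retries.
from openai import OpenAI

client = OpenAI()

FIELDS = {
    "plan_type": "What type of plan does the user want (individual or family)?",
    "num_people": "How many people are on the plan?",
    "city": "Which city is the user in?",
}

def extract(transcript: str, question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Answer from the transcript only. "
                                          "If the answer is not present, reply exactly NOT FOUND."},
            {"role": "user", "content": f"Transcript:\n{transcript}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content.strip()

transcript = "User: Hi, I'd like a family plan for my household in Atlanta."
answers = {}
for field, question in FIELDS.items():
    answer = extract(transcript, question)
    while answer == "NOT FOUND":
        reply = input(question + " ")      # ask the user directly
        transcript += f"\nUser: {reply}"    # append and retry, as described above
        answer = extract(transcript, question)
    answers[field] = answer
print(answers)
```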

I’d definitely add a confirmation at the end for the user to validate the bot’s responses.

And overall I’d choose a more algorithmic approach to this problem, maybe using an LLM on a per-question basis and forcing the user to answer in your structured flow order.

Best of luck.

1 Like

I am interested in the open source solution. Please share it - the Supabase integration sounds interesting.

1 Like

Thank you so much - this is very helpful.

A reasonably good solution would be to combine a traditional bot with an LLM: the traditional bot for the question flow, and the LLM for a more natural conversation.