Multi-function call - ask for parameters one after another if the first called function returns success

Hello guys,

I’m trying to figure out how to create a scenario where ChatGPT asks for one parameter after another (something like: ask for login → if the provided login exists → ask for password):

  1. User asks for his bill information
  2. ChatGPT runs function calling → asks for e.g. his e-mail / login / ID
  3. User provides a login that is, let’s say, checked against the DB by the called function
  4. If the previous step returned success (the user exists in the DB), ChatGPT confirms that the provided login is correct and asks for the next piece of data needed to authorize him - his secretID/password (whatever) - to get access to his account
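For context, here is roughly how I imagine setting up the tools, using the OpenAI Python SDK. The function names (`check_login`, `verify_secret`), parameter names, and the model name are just placeholders I made up to illustrate the two-step idea:

```python
# Rough sketch of the two-step idea - all names below are placeholders.
from openai import OpenAI

client = OpenAI()

tools = [
    {
        "type": "function",
        "function": {
            "name": "check_login",  # placeholder name
            "description": "Check whether the provided login/e-mail exists in the DB",
            "parameters": {
                "type": "object",
                "properties": {
                    "login": {"type": "string", "description": "The user's login or e-mail"}
                },
                "required": ["login"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "verify_secret",  # placeholder name
            "description": "Verify the user's secretID/password and return his bill status",
            "parameters": {
                "type": "object",
                "properties": {
                    "login": {"type": "string"},
                    "secret_id": {"type": "string"},
                },
                "required": ["login", "secret_id"],
            },
        },
    },
]

messages = [
    {
        "role": "system",
        "content": "Ask for the login first. Only after the login is confirmed to exist, "
                   "ask for the secretID.",
    },
    {"role": "user", "content": "check my bill status"},
]

response = client.chat.completions.create(model="gpt-4", messages=messages, tools=tools)
```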

Does anyone have an idea how to implement such a flow/scenario?
How can I make ChatGPT ask the user for the next piece of data immediately after executing a function call, without waiting for another user input?

EDIT with more information about the goal:

My goal is to create a conversational bot that you can talk to about lots of topics, but that can also run function calling to serve you information from a DB after certain user questions, for example: “check my bill status”. However, I want to be able to do two-step verification (login → if the login exists → ask for password/secretID), so not just ask for a single piece of data/parameter (as in the ‘what’s the weather’ examples, where only the location is crucial).

So after that kind of user input:
a) the LLM will use function calling and ask for the required data - let’s say in the first step we need his login
b) in the function mentioned earlier I will have implemented some validation to check whether the provided login exists (maybe a DB query, whatever)
c) if the login provided by the user exists, the LLM will thank him for the data and ask for the second piece of data that is needed
d) the user will provide the second parameter
e) the LLM will run the second function

I lack ideas on how to make points c, d, and e happen.

PS: Asking for both parameters at the start of the first called function is also not a solution for me, because there is no place where the user provides both the login and the password in one input/line :smiley:

Thanks in advance for your support.


Hi and welcome to the Developer Forum!

You do not need a Large Language Model for this task; it is structured data capture: you present the user with a form requesting their name, address, etc.

Attempting to capture user data in an unstructured, conversational manner is fraught with edge cases and potential issues. You can tell the AI to gather specific input elements and it will ask for them, but the validation and processing of that information must either be done in a subsequent API call, or you can simply present the user with a details form as part of your code flow.

Thank you, Foxabilo, for your reply.

I have edited my first post and added some additional information about the goal I want to reach. I was hoping to create something fancier than a form to fill in with information.

Is that kind of task not possible to complete with LLM function calling?
Should it be done in another way, as you mentioned?

It is absolutely possible to gather this data via an LLM question chain: you prompt the LLM to gather the required information, the model will ask for it, the user types a response, and then you can ask the model whether it thinks the user entered all of the correct data.
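Here is a minimal sketch of that loop, assuming the OpenAI Python SDK and using hypothetical names (`check_login`, `login_exists_in_db`) for your DB check. The key point for your steps c, d, and e is that after you append the tool result you call the model again without any new user message, so the model itself can confirm the login and ask for the secretID:

```python
# Minimal sketch: execute the tool call, append the result, let the model speak again.
import json
from openai import OpenAI

client = OpenAI()

def login_exists_in_db(login: str) -> bool:
    # Placeholder for your real DB query
    return login.endswith("@example.com")

def run_turn(messages: list, tools: list) -> str:
    response = client.chat.completions.create(model="gpt-4", messages=messages, tools=tools)
    msg = response.choices[0].message

    while msg.tool_calls:
        messages.append(msg)  # keep the assistant message that holds the tool call
        for call in msg.tool_calls:
            args = json.loads(call.function.arguments)
            result = {"login_exists": login_exists_in_db(args["login"])}
            messages.append({
                "role": "tool",
                "tool_call_id": call.id,
                "content": json.dumps(result),
            })
        # Call the model again with NO new user input: seeing the tool result,
        # it confirms the login and asks for the next piece of data by itself.
        response = client.chat.completions.create(model="gpt-4", messages=messages, tools=tools)
        msg = response.choices[0].message

    return msg.content  # the assistant's next question (or answer) to show the user
```

Each time the user replies, you append their message to `messages` and call `run_turn` again; the system prompt controls the order in which the model asks for things.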

One of the issues will be: how do you know the data returned is actually valid without a rigorous user-input filtering and checking system? The answer, in this case, is that you do not, so you will need to check the user's input after they have typed it, and if there is an issue you will need them to type it again. You go from the user being able to click on their input and correct an error, as happens with standard forms, to you requesting that the user re-type an answer from scratch.
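For example, a crude local check before you even send the text to the model could look like the sketch below (the e-mail regex is purely illustrative):

```python
# Crude illustration of the "re-type from scratch" problem: validate the typed
# login locally and keep asking until it passes.
import re

def looks_like_email(text: str) -> bool:
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", text) is not None

user_text = input("login: ")
while not looks_like_email(user_text):
    # With a form the user could just fix the field; here they must re-type it all.
    print("That does not look like a valid e-mail, please type it again.")
    user_text = input("login: ")
```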

If your main reason for doing this is to take advantage of the novelty of AI in your product, that is fine; just understand that it is a suboptimal way to gather data.