Hello
I need to extract several pieces of information from a client in a chat, but I need to make sure the model only asks one question at a time. I am testing with gpt-3.5-turbo in chat mode.
I found this prompt that seems to work:
The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly. If the user wants they can talk about anything as soon as the assistant finishes collecting the following fields from the user:
The user's name
The user's departure location
The user's destination location
The user's dates of travel
The user's budget for the trip
Any additional preferences or constraints the user may have
After collecting all these fields the AI should thank the human for their time. It should then write EODC (for end of data collection) and on a new line output all these fields as JSON. {"user_name": "<user_name>", "departure_location": "<departure_location>", "destination_location": "<destination_location>", "dates_of_travel": "<dates_of_travel>", "budget": <budget>, "additional_preferences": "<additional_preferences>"}
user: book a flight
AI: Sure! I can help you book a flight. Can I start by getting your name?
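For context, once the model emits the EODC marker I plan to split the reply and parse the JSON that follows it. Here is a minimal sketch of that step (the field values in the example are made up, just to show the shape I expect):

```python
import json

def extract_fields(reply):
    """If the assistant's reply contains the EODC marker, parse the JSON
    that follows it; otherwise return None (collection still in progress)."""
    if "EODC" not in reply:
        return None
    _, _, tail = reply.partition("EODC")
    return json.loads(tail.strip())

# Hypothetical final reply, matching the format requested in the prompt above:
reply = (
    "Thank you for your time!\nEODC\n"
    '{"user_name": "Ana", "departure_location": "Miami", '
    '"destination_location": "Chicago", "dates_of_travel": "June 1 to June 8", '
    '"budget": 500, "additional_preferences": "aisle seat"}'
)
fields = extract_fields(reply)
```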
The problem is that sometimes (maybe 50% of the time) it asks like this:
Sure, I can help you book a flight. Before we proceed, can you please provide me with the following information:
- Your name?
- Your departure location?
- Your destination location?
- Your dates of travel?
- Your budget for the trip?
- Any additional preferences or constraints you may have?
I need to make sure it only asks one question at a time, even if the user provides more than one piece of information at once, like:
user: book a flight from Miami to Chicago (the model should ask all the remaining questions one at a time)
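In case it helps, here is a minimal sketch of how I'm sending the prompt, assuming the `openai` Python package (v1 client). The system message is the prompt above plus an explicit one-question rule I've been experimenting with; the temperature value is just what I've been testing with, not a recommendation:

```python
# The collection prompt from above, abbreviated here, with an explicit
# one-question rule appended as an experiment.
SYSTEM_PROMPT = (
    "The following is a conversation with an AI assistant. The assistant is "
    "helpful, creative, clever, and very friendly. [... field list and EODC "
    "instructions as above ...]\n"
    "IMPORTANT: Ask for exactly ONE missing field per reply. Never list "
    "several questions in a single message, even if the user has already "
    "supplied some of the fields."
)

def build_messages(history):
    """Prepend the system prompt to the running conversation."""
    return [{"role": "system", "content": SYSTEM_PROMPT}] + history

def ask(client, history):
    """Send one turn to gpt-3.5-turbo and return the assistant's reply."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=build_messages(history),
        temperature=0.2,  # lower temperature seems to help it follow format rules
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    from openai import OpenAI  # pip install openai
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    history = [{"role": "user", "content": "book a flight from Miami to Chicago"}]
    print(ask(client, history))
```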
Thanks for any help