Prompting for Scheduling Appointments

I’m new to prompting via the API and find it challenging to reproduce consistent and non-conflicting results.

My project is to create a scheduling assistant to gather user information and provide appointment times.

  • How descriptive should I be in my system instructions?

  • Can I tell it to reference a specific document for steps/processes to schedule an appointment?

  • Are there best practices to collecting information needed? (Can I list a bunch of required information like “1. Ask their full name 2. Ask their phone number 3. Ask user if they want in-person or online appointment.”)

Clarity and specificity are key in your instructions to the API; be as descriptive as necessary to ensure accurate understanding and responses. If your process involves complex steps or conditions, detailing them in your instructions will help you get more accurate outputs. Ambiguity in, ambiguity out. Regarding referencing documents: I don’t think the current OpenAI API can directly process external documents, but you can integrate document content into your API requests (copy and paste). For instance, if you have a standard procedure in a document, you can include its text in your API request as context.
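As a minimal sketch of the copy-and-paste approach: the procedure text below and the helper name are made up for illustration, but the pattern of pasting your document into the system message is the whole idea.

```python
# Sketch: embed a procedure document's text directly into the system prompt.
# SCHEDULING_STEPS stands in for text you'd paste from your own document.

SCHEDULING_STEPS = """\
1. Greet the user and ask for their full name.
2. Ask for their phone number.
3. Ask whether they want an in-person or online appointment.
"""

def build_system_prompt(procedure_text: str) -> str:
    """Combine fixed instructions with pasted document content."""
    return (
        "You are a scheduling assistant. Follow this procedure exactly:\n\n"
        + procedure_text
        + "\nIf the user skips a step, politely ask again before moving on."
    )

# The resulting messages list is what you'd send in your API request.
messages = [
    {"role": "system", "content": build_system_prompt(SCHEDULING_STEPS)},
    {"role": "user", "content": "Hi, I'd like to book an appointment."},
]
```

Because the document text lives in your code, updating the procedure is just a matter of editing (or re-reading) that string.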

You can definitely instruct the API to collect specific pieces of information in a sequence, like:

Ask for their full name.
Request their phone number.
etc… should all work fine.

Handling variability in responses is important, though. Outliers, and users who provide information in unexpected formats or skip questions, should be accommodated, so prepare your instructions to handle these cases gracefully.

Be sure to standardize your interaction flows as much as possible: use similar language, terminology, and interaction patterns throughout the process. Consistency in, consistency out.

Incorporate feedback mechanisms to continuously improve the assistant’s performance based on user interactions and experiences. Design your system to handle errors or uncertainties effectively. If the assistant is unsure about a user’s input, it should know how to ask clarifying questions. This one is tricky, your mileage may vary.
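One way to wire up clarifying questions is to classify the user's answer in your own code and fall back to a question when it's ambiguous. This is a sketch with made-up keywords, not a full classifier:

```python
def clarify_or_accept(raw: str) -> str:
    """Map an appointment-type answer to a known value, or ask to clarify.

    Keyword matching here is illustrative; a real system might let the
    model do this classification instead.
    """
    answer = raw.strip().lower()
    if "person" in answer:
        return "in-person"
    if "online" in answer or "virtual" in answer:
        return "online"
    return "CLARIFY: Did you want an in-person or online appointment?"
```

The point is that the assistant never silently guesses: an unrecognized answer always produces another question.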

  1. More descriptive is better. With the new Assistants you will see that your prompt (as saved in the assistant!) can be very long, relatively speaking. More specific is better.
  2. Yes, you can. Upload the document(s) to the assistant and reference them in the prompt from 1).
  3. Yes, see 1).

My prediction is that the biggest challenge will be to get it to do things in ‘steps’, i.e. you might need another assistant to prompt the first one to do the next thing.

  1. Cool! So if I ask it to reference the steps in the document, can I then omit the long text from the prompt call itself?

Thank you! I’m new to this - that helps!

Few follow ups:

Handling variability in responses is important, though. Outliers, and users who provide information in unexpected formats or skip questions, should be accommodated, so prepare your instructions to handle these cases gracefully.

Is this done through system instructions or via a validation function (or both)?

Also, does adjusting the temperature value help keep the bot more consistent and predictable?
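For what it's worth, lowering temperature does make sampling more deterministic, which tends to keep a structured flow more consistent across runs. A sketch of where the parameter goes in a chat request; the model name is a placeholder and the commented-out client call assumes the openai Python package's v1-style interface:

```python
# temperature near 0 = most deterministic sampling; the API default is 1.
request = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a scheduling assistant."},
        {"role": "user", "content": "I'd like to book an appointment."},
    ],
    "temperature": 0,  # lower values give more predictable wording
}

# from openai import OpenAI
# response = OpenAI().chat.completions.create(**request)  # needs an API key
```

Temperature controls wording variability, not logic: it won't fix a prompt that leaves the flow ambiguous.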

Not sure what you mean by that - that depends on what your prompt says about it?