I want to build a bot that helps users find jobs; the first step is profile building.
I’d like ChatGPT to have a “conversational” feel, like you’re talking to a friend. Based on the docs, I got an MVP built and deployed, and ChatGPT asks profile questions based on the query params of a single endpoint.
This means that if the endpoint supports 10 params, ChatGPT will ask for all 10 in a single response. I’d like ChatGPT to ask for them one at a time, with each back-and-forth returning more accurate data, instead of bombarding the user with 10 things to answer at once.
The only workaround I can think of is a separate endpoint for each param, which would not be ideal.
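For reference, the relevant part of my OpenAPI spec currently looks roughly like this (endpoint and parameter names are illustrative, not my actual spec):

```json
{
  "paths": {
    "/profile": {
      "get": {
        "operationId": "updateProfile",
        "summary": "Collect job-seeker profile details",
        "parameters": [
          { "name": "desired_role", "in": "query", "required": true, "schema": { "type": "string" } },
          { "name": "location", "in": "query", "required": true, "schema": { "type": "string" } },
          { "name": "salary_min", "in": "query", "required": false, "schema": { "type": "integer" } },
          { "name": "remote_ok", "in": "query", "required": false, "schema": { "type": "boolean" } },
          { "name": "years_experience", "in": "query", "required": false, "schema": { "type": "integer" } }
        ]
      }
    }
  }
}
```

With nothing else steering it, ChatGPT reads all of these parameters and asks for every one of them in the same message.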
Has anyone gotten a conversational flow working in their plugins?
I am also curious about this! Ideally we could start showing some results once the required information is collected, but still ask for optional information and keep showing refined results as new information comes in.
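To make that concrete, here is a sketch of what I have in mind (parameter names and descriptions are hypothetical): mark only the must-have fields as required, and describe the optional ones as refinements the model can collect in later turns before calling the endpoint again.

```json
{
  "parameters": [
    {
      "name": "desired_role",
      "in": "query",
      "required": true,
      "description": "Required. Results can be shown as soon as this is known.",
      "schema": { "type": "string" }
    },
    {
      "name": "salary_min",
      "in": "query",
      "required": false,
      "description": "Optional refinement. May be asked for in a later turn; call the endpoint again with it to narrow the results.",
      "schema": { "type": "integer" }
    }
  ]
}
```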
We have a customer at PluginLab who had a similar request.
We helped him control the flow of his plugin.
Even though it didn’t sound as complex as the use case you described, what was a game changer was:
Adding a very sharp and exhaustive description_for_model.
Adding a description field for every endpoint in your OpenAPI spec.
Regarding the description_for_model, keep in mind you can use up to 8,000 characters, which means you can say A LOT. Add anything that would help the model understand the expected flow of your plugin here.
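As a rough illustration (a hypothetical manifest, not our customer's actual one), the description_for_model can spell out the turn-by-turn flow explicitly:

```json
{
  "schema_version": "v1",
  "name_for_human": "Job Finder",
  "name_for_model": "job_finder",
  "description_for_human": "Build a profile and find matching jobs.",
  "description_for_model": "Plugin for building a job-seeker profile and finding matching jobs. FLOW: gather the profile conversationally, asking for ONE missing field per message. Start with the desired role only. After every user answer, call the profile endpoint with everything known so far and show the current matches. Then ask for exactly one more detail (location, then salary, then remote preference). Optional fields only refine results; never block on them and never ask for more than one field at a time.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.json"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

Pairing that with a description on each endpoint in the OpenAPI spec (e.g. “Accepts partial profiles; call this after every answer”) reinforces the same flow from the spec side.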