Where is the compact description of your plugin sourced from?

In the docs, it says

  • OpenAI will inject a compact description of your plugin in a message to ChatGPT, invisible to end users. This will include the plugin description, endpoints, and examples.
  • When a user asks a relevant question, the model may choose to invoke an API call from your plugin if it seems relevant; for POST requests, we require that developers build a user confirmation flow.

Where does the compact description specifically come from? Is it from the `description_for_model` field in the manifest?

It’s a combination of both. The manifest acts as the initial trigger, so max out its descriptions and include things like "/start to begin" in the description text. Next, the OpenAPI specification tells the LLM what data, fields, and other parameters are available. It’s a two-step process: the model first reads the manifest, and that gets it to the second step, the API specification.
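As a rough sketch of the first step, here is what a plugin manifest (`ai-plugin.json`) might look like, with `description_for_model` carrying the text the model sees. The plugin name, URLs, and the "/start" instruction are placeholder examples, not values from any real plugin:

```json
{
  "schema_version": "v1",
  "name_for_human": "Todo Plugin",
  "name_for_model": "todo",
  "description_for_human": "Manage your todo list from chat.",
  "description_for_model": "Plugin for managing a user's todo list. Say /start to begin. Supports adding, removing, and listing todo items.",
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "auth": { "type": "none" },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

The `api.url` then points at the OpenAPI spec for the second step, where the endpoint summaries and parameter descriptions give the model the finer-grained details it needs to build a call.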