How to handle staging and prod environments?

Following the plugin instructions, we have an ai-plugin.json manifest with a static URL for fetching the API schema.

If we have a staging environment on a different domain, we get an error like "API URL is not under …" when trying to install the staging plugin.

Are people serving different manifest files depending on which domain the user is coming from?


I have some code that dynamically injects values into the manifest at run time.

My manifest file has placeholders:

  "schema_version": "v1",
  "name_for_human": "Chess",
  "name_for_model": "Chess",
  "description_for_human": "Unleash your inner chess master with this interactive chess experience! You can play against a novice or a grandmaster!",
  "description_for_model": "Plugin for playing chess. Send moves to the plugin and display the results using the 'display' field. Ask the user what level they would like to play at and what color they would like to play.",
  "api": {
    "type": "openapi",
    "url": "PROTOCOL://PLUGIN_HOSTNAME/openapi.yaml",
    "is_user_authenticated": false
  "logo_url": "PROTOCOL://PLUGIN_HOSTNAME/logo.png",
  "contact_email": "",
  "legal_info_url": "PROTOCOL://PLUGIN_HOSTNAME/terms.html"

And in my code I have this (I'm using Python, but other languages have similar functionality):

    import json
    import os

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/.well-known/ai-plugin.json")
    def serve_ai_plugin():
        with open(".well-known/ai-plugin.json", "r") as f:
            data = f.read()
        # substitute the placeholders using the incoming request
        data = data.replace("PLUGIN_HOSTNAME", request.host)
        data = data.replace("PROTOCOL", request.scheme)
        # get the json
        json_response = json.loads(data)
        # fill in the auth settings
        # for localhost we can only do "none"
        if "localhost" in request.host:
            json_response["auth"] = {"type": "none"}
        else:
            json_response["auth"] = {
                "type": "service_http",
                "authorization_type": "bearer",
                "verification_tokens": {
                    "openai": os.environ.get("OPENAI_VERIFY_TOKEN")
                },
            }
        return jsonify(json_response)
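If you want to unit-test the substitution without spinning up Flask, the same logic can be factored into a plain function. Here's a minimal sketch along those lines; the trimmed template, hostnames, and token value below are illustrative only:

```python
import json

# Trimmed-down template with the same placeholders (illustrative)
MANIFEST_TEMPLATE = """{
  "api": {"type": "openapi", "url": "PROTOCOL://PLUGIN_HOSTNAME/openapi.yaml"},
  "logo_url": "PROTOCOL://PLUGIN_HOSTNAME/logo.png"
}"""

def render_manifest(template, scheme, host, verify_token=None):
    """Substitute placeholders, then pick auth settings based on the host."""
    data = template.replace("PLUGIN_HOSTNAME", host).replace("PROTOCOL", scheme)
    manifest = json.loads(data)
    if "localhost" in host:
        # localhost only supports "none"
        manifest["auth"] = {"type": "none"}
    else:
        manifest["auth"] = {
            "type": "service_http",
            "authorization_type": "bearer",
            "verification_tokens": {"openai": verify_token},
        }
    return manifest

# Local dev gets no auth; a deployed host gets service_http.
dev = render_manifest(MANIFEST_TEMPLATE, "http", "localhost:5000")
prod = render_manifest(MANIFEST_TEMPLATE, "https", "chess.example.com", "tok123")
```

The route handler then just calls `render_manifest` with `request.scheme` and `request.host`, which keeps the environment-specific branching in one easily testable place.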

Thanks for the confirmation, @iamflimflam1.

I ended up doing something similar and believe it’s working right now.

Hey @pthieu ,

I believe you can easily solve this issue by using our service.

Since we act as a middle layer between your backend and ChatGPT, you could use any backend and we would provide you a preview domain for your dev/staging environment as well as a custom domain for your production environment.

I would be happy to chat about it 🙂