I have a plugin that creates feedback loops to solve complex problems, do code generation, etc. It works great, but I get this random error, which isn't really an error, and I don't see the request on my server.
Not sure it is quite the same problem, though maybe related: I occasionally see ChatGPT say it is about to send a message to my plugin, then say 'message sent', but never in fact invoke the plugin. Both of these seem, to me, related to non-deterministic behavior by ChatGPT. So:
- What temperature does chatGPT use?
- What error recovery mechanisms are available? Given the non-deterministic nature of prompt results, it seems like we need some kind of try/except functionality.
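In the meantime I've been approximating try/except on my side with a retry wrapper around the flaky call — something like this (the helper name and parameters are mine, nothing official):

```python
import time

def with_retries(fn, max_retries=3, backoff=1.0):
    """Call fn(); on failure, retry with exponential backoff.

    Returns fn()'s result, or re-raises the last exception once
    all max_retries attempts have failed.
    """
    last_exc = None
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as exc:  # narrow this to your real error types
            last_exc = exc
            time.sleep(backoff * (2 ** attempt))
    raise last_exc
```

It doesn't fix the non-determinism, but it turns "sometimes the call silently fails" into "retry a few times, then surface a real exception".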
This is that issue I brought up with you a week or so back: it's ChatGPT hallucinating APIs. Basically, ChatGPT attempts to use an API that is not available in your openapi.yaml file and then throws this error. In my experience it has a tendency to do this more as the yaml file grows in size. One quick fix is to either make available/create the API that ChatGPT wants to use, or rename whichever API you have that it is trying to use to the name it wants. There are more elaborate fixes if you start encountering this frequently, but that should help if it is a one-off or rare event.
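One way to partially automate the "rename it to what ChatGPT wants" fix is to fuzzy-match the hallucinated operation name against the operations you actually expose, then dispatch to the closest real one. A rough standard-library sketch (the operation names below are made up for illustration):

```python
import difflib

# The operations actually defined in your openapi.yaml (example names).
REAL_OPERATIONS = ["get_weather", "get_forecast", "create_note"]

def resolve_operation(requested: str):
    """Map a possibly-hallucinated operation name to the closest real one.

    Returns the best match, or None if nothing is reasonably close.
    """
    matches = difflib.get_close_matches(
        requested.lower(), REAL_OPERATIONS, n=1, cutoff=0.6
    )
    return matches[0] if matches else None
```

You'd call this in a catch-all route before returning a 404, so a request for a slightly-misnamed endpoint still lands on the right handler.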
Best,
Chase W. Norton
The best approach I've found seems to be a single proxy API endpoint that can return different data based on a natural-language request. This is a simple example that uses keywords, but you could use the OpenAI API to handle the command and control.
```python
from fastapi import FastAPI, HTTPException
import httpx

app = FastAPI()

# Which backend each logical API name maps to.
API_MAPPING = {
    "api_1": "https://api1.example.com/data",
    "api_2": "https://api2.example.com/data",
    "api_3": "https://api3.example.com/data",
}

# Keywords that route a request to each backend.
KEYWORDS_MAPPING = {
    "api_1": ["keyword1", "keyword2"],
    "api_2": ["keyword3", "keyword4"],
    "api_3": ["keyword5", "keyword6"],
}

def determine_api(text: str):
    """Return the first api_name whose keywords appear in the text."""
    for api_name, keywords in KEYWORDS_MAPPING.items():
        for keyword in keywords:
            if keyword.lower() in text.lower():
                return api_name
    return None

@app.post("/proxy")
async def proxy_api(text: str):
    api_name = determine_api(text)
    if api_name is None:
        raise HTTPException(status_code=404, detail="No suitable API found based on provided text")
    api_url = API_MAPPING[api_name]
    async with httpx.AsyncClient() as client:
        try:
            response = await client.get(api_url)
            response.raise_for_status()
        except httpx.HTTPError as e:
            raise HTTPException(status_code=500, detail=f"Error fetching data from {api_name}: {e}")
    return response.json()
```
This script uses a KEYWORDS_MAPPING dictionary that maps each api_name to a list of keywords. The determine_api function takes the input text and checks if any of the keywords are present in the text. If a keyword is found, it returns the corresponding api_name.
After running the FastAPI application with `uvicorn proxy_api:app --reload`, you can access the Swagger UI at http://127.0.0.1:8000/docs.
The generated spec (swagger.json) can be found at http://127.0.0.1:8000/openapi.json. (This is the OpenAPI specification you give to ChatGPT.)
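If you're wiring this into a plugin, the ai-plugin.json manifest is what points ChatGPT at that generated spec. Roughly like this — all the field values here are placeholders for your own:

```json
{
  "schema_version": "v1",
  "name_for_human": "Proxy Plugin",
  "name_for_model": "proxy",
  "description_for_human": "Routes natural-language requests to backend APIs.",
  "description_for_model": "Send the user's request text to /proxy; it picks the right backend API.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "http://127.0.0.1:8000/openapi.json"
  },
  "logo_url": "http://127.0.0.1:8000/logo.png",
  "contact_email": "you@example.com",
  "legal_info_url": "http://example.com/legal"
}
```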
My sense based on a day’s worth of digging:
- YAML is not as straightforward as it seems, at least for quick copy-paste devs like me.
- (independently of the above): the simpler the yaml def, the less likely ChatGPT is to screw up. For example, I haven't been able to get it to correctly format a GET with multiple query args. I'm just using quart_cors, YMMV (but I can curl the endpoint, so I don't think that's the issue).
YAML is awful for a lot of reasons. I use a structured approach for my AI dev using a spec I created called AiTOML. If you're curious, take a look at my GitHub below.
I think OpenAI has done a lot of things right; choosing Swagger for the plugin API specification was not one of them. It's an old and outdated API approach.
Here is my specification: