Developing GPT Action APIs on localhost

I'm trying to develop Action APIs, but it seems that unlike plugins, Actions don't allow localhost development. I'm wondering how people are getting around that; any suggestions are welcome!

You can use ngrok's free tier to create a Cloud Edge, which will give you a URL you can use in the OpenAPI spec. Then you can run ngrok.exe locally to create a tunnel from the ngrok server to your localhost:

./ngrok.exe tunnel --label edge=edghts_fjsiasuhjkerwgjZMw http://localhost:80

This also downgrades from HTTPS, which OpenAI sends requests over, to plain HTTP, which I think is easier to receive locally, though there may be a security consideration. You can probably leave it as HTTPS if you want to.

Oh, and your edge=ed[…] value will be unique to your account/endpoint. You also have to complete an authorization step in ngrok first.

But now GPT Action traffic should go from OpenAI to the ngrok server, then be tunneled to your PC through the ngrok.exe process (I think?), and show up on port 80, or whatever port you specified in the command above.
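For reference, the edge URL from ngrok goes in the `servers` block of the Action's OpenAPI spec. A hedged sketch, assuming a single POST endpoint; the `example.ngrok.app` URL, `/action` path, and operation name are placeholders, not real values:

```yaml
# Sketch of an Action OpenAPI spec pointing at an ngrok edge URL.
openapi: 3.1.0
info:
  title: Local dev Action
  version: "1.0"
servers:
  - url: https://example.ngrok.app   # placeholder: your ngrok edge URL
paths:
  /action:                           # placeholder path
    post:
      operationId: callAction
      responses:
        "200":
          description: OK
```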


Use localtunnel.

It's like ngrok, but better.

Ask GPT to explain it to you.


Using ngrok for now; would love to give localtunnel a try later. Thanks, man!

Yeah, no problem.

I use localtunnel because I can request a constant subdomain, rather than entering a new address every session.

I can also embed it in my server, so I don't have to reset that address.

But to each their own.
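For reference, requesting a constant subdomain with localtunnel's CLI looks roughly like this; the subdomain name is a placeholder, and it stays yours only while the tunnel is up and the name is available:

```shell
# Tunnel localhost:80 and request a stable subdomain (placeholder name).
lt --port 80 --subdomain my-gpt-action
```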
