Excited to announce Open Responses docs.julep.ai/responses/quickstart – a self-hosted alternative to OpenAI’s new Responses API that you can customize and run with ANY LLM model / provider, not just OpenAI. What’s more, it’s also compatible with the agents-sdk, so everything just works!
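For example, once the server is running locally, you can point the official OpenAI SDK at it and keep the same Responses API call shape, backed by whatever model/provider you’ve configured. A minimal sketch – the base URL, port, API key, and model name below are assumptions, so check the quickstart for your actual values:

```python
from openai import OpenAI

# Point the OpenAI SDK at your self-hosted Open Responses server.
# (Assumed local endpoint; see docs.julep.ai/responses/quickstart for the real host/port.)
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="not-needed-locally",  # placeholder; set a real key if your setup requires one
)

# Same Responses API shape, served by any model/provider you've configured.
response = client.responses.create(
    model="gpt-4o-mini",  # swap in any model your provider supports
    input="Say hello from Open Responses!",
)
print(response.output_text)
```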
To try it out, just run npx -y open-responses init (or uvx) and that’s it!
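And because it’s compatible with the agents-sdk, you can drop the local server into an agent workflow too. A rough sketch, assuming the same local endpoint as above (not the official setup – adjust to match your deployment):

```python
from openai import AsyncOpenAI
from agents import Agent, Runner, set_default_openai_client, set_tracing_disabled

# Assumed local endpoint for the self-hosted Open Responses server.
client = AsyncOpenAI(base_url="http://localhost:8080/v1", api_key="not-needed-locally")
set_default_openai_client(client)
set_tracing_disabled(True)  # optional: skip tracing uploads when not using an OpenAI key

agent = Agent(name="Assistant", instructions="You are a helpful assistant.")
result = Runner.run_sync(agent, "Write a one-line haiku about open-source AI.")
print(result.final_output)
```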
We’d love feedback from the OpenAI community on how it integrates with your pipelines (support for streaming and voice agents coming soon!). Check out docs.julep.ai/responses/quickstart, and the repo: julep-ai/open-responses. Let’s push open-source AI forward together!