Turns out you have to stuff the payload into the URL, i.e. https://api.openai.com/v1/engines/curie/completions/browser_stream?prompt='Once upon a time'&max_tokens=3
Thanks for the reply! I tried that, but the _stream endpoint only accepts GET requests, which can’t contain a body. I think stuffing it into the URL is the only way…
It seems like you can use stream: true with the regular POST completions endpoint, but you have to use this library:
They actually recommend it in the documentation:
“If you’d like to stream results from the POST variant in your browser, consider using the SSE library.”
If you look into the code, it seems like it opens an XHR instance, listens to its progress event, and parses the data it receives into tokens. So I’m guessing that setting stream: true on the POST request is just enabling chunked transfer encoding on the response.
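For what it’s worth, here’s roughly what that looks like in the browser. This is only a sketch based on sse.js’s README options (headers, method, payload); the engine, prompt, and max_tokens are placeholders, and depending on your sse.js version you may load it via a script tag instead of an import. Keep in mind this still ships your API key to the client:

```js
// Browser-side sketch using sse.js (option names taken from its README;
// older versions expose a global SSE instead of an ES module export).
// Warning: this exposes your API key to anyone who opens the dev tools.
import { SSE } from 'sse.js';

const OPENAI_API_KEY = 'sk-...'; // placeholder; don't ship a real key to the browser

const source = new SSE('https://api.openai.com/v1/engines/curie/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${OPENAI_API_KEY}`,
  },
  payload: JSON.stringify({
    prompt: 'Once upon a time',
    max_tokens: 32,
    stream: true,
  }),
});

source.addEventListener('message', (event) => {
  if (event.data === '[DONE]') {      // OpenAI ends the stream with this sentinel
    source.close();
    return;
  }
  const chunk = JSON.parse(event.data);
  console.log(chunk.choices[0].text); // a token (or a few) per event
});

source.stream(); // sse.js doesn't fire the request until you call stream()
```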
Since you can stream the response to a POST request on the server, I would personally opt for that, rather than server-sent events or any other browser implementation, so you don’t have to expose your secret API key on the client (if you plan on shipping what you’re working on to users). A relay like the sketch below is what I have in mind.
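A minimal relay along these lines keeps the key on the server and just forwards the event stream to the browser. This is a sketch using Express and node-fetch v2; the /api/completions route, the port, and the engine are all made up for illustration:

```js
// Sketch of a server-side relay: the browser calls /api/completions (a made-up route),
// and only this server ever sees the OpenAI key. Uses Express and node-fetch v2.
const express = require('express');
const fetch = require('node-fetch');

const app = express();
app.use(express.json());

app.post('/api/completions', async (req, res) => {
  const upstream = await fetch('https://api.openai.com/v1/engines/curie/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`, // key never leaves the server
    },
    body: JSON.stringify({ ...req.body, stream: true }),
  });

  // Forward the event stream as-is; the browser can consume it with sse.js or fetch.
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  upstream.body.pipe(res); // node-fetch v2: upstream.body is a Node.js Readable stream
});

app.listen(3000);
```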
Are you running this in the browser or on the server? You should only use the sse.js library in the browser. If you’re on the server (in Node), you can use node-fetch and read from the res.body stream as the data arrives:
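Something along these lines. It’s a sketch assuming node-fetch v2, where res.body is a Node.js Readable stream, and the SSE parsing is simplified (it skips lines that get split across chunks rather than buffering them):

```js
// Node-side sketch: POST with stream: true, then read res.body as data arrives.
// Assumes node-fetch v2, where res.body is a Node.js Readable stream.
const fetch = require('node-fetch');

async function streamCompletion(prompt) {
  const res = await fetch('https://api.openai.com/v1/engines/curie/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({ prompt, max_tokens: 32, stream: true }),
  });

  res.body.on('data', (chunk) => {
    // The response is server-sent events: "data: {...}" lines ending with "data: [DONE]".
    for (const line of chunk.toString().split('\n')) {
      if (!line.startsWith('data: ')) continue;
      const payload = line.slice('data: '.length);
      if (payload === '[DONE]') return;
      let parsed;
      try {
        parsed = JSON.parse(payload);
      } catch {
        continue; // partial line split across chunks; skipped in this sketch
      }
      process.stdout.write(parsed.choices[0].text);
    }
  });

  res.body.on('end', () => console.log('\ndone'));
}

streamCompletion('Once upon a time');
```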
If you’re using this in the browser, could you record a Loom or tell us exactly which errors you’re getting?