hi! i’m trying to call the OpenAI API to generate some text based on some input. it worked fine until i deployed my website on Render, and now i get CORS errors.
right now i have a frontend that sends JSON input to my backend in the body of a POST request. the backend makes the OpenAI API call using that JSON as input, then sends a response back to the frontend with the OpenAI output.
i think it’s specifically the call to the OpenAI API that breaks things; when i comment it out, the error goes away.
the error:
Access to fetch at (backend endpoint) from origin (frontend website) has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
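from what i understand, this error just means the browser never saw an Access-Control-Allow-Origin header on the response (which can also happen if the server crashes or times out before responding at all). here's my mental model of what the middleware has to do — a hand-rolled sketch of roughly what the `cors` npm package handles for you, where `FRONTEND_ORIGIN` is a placeholder for my deployed frontend's URL, not a variable from my actual code:

```javascript
// Minimal sketch of Express-style CORS middleware. The `cors` npm package
// does this and more; FRONTEND_ORIGIN below is a placeholder.
function allowOrigin(origin) {
  return function (req, res, next) {
    // Tell the browser which origin is allowed to read this response.
    res.setHeader("Access-Control-Allow-Origin", origin);
    // Allow the JSON content type the frontend sends.
    res.setHeader("Access-Control-Allow-Headers", "Content-Type");
    if (req.method === "OPTIONS") {
      // Answer the preflight request directly, with no body.
      res.statusCode = 204;
      res.end();
      return;
    }
    next();
  };
}

// Usage (assuming an Express app): app.use(allowOrigin(FRONTEND_ORIGIN));
```

in practice i'd just use `app.use(cors({ origin: FRONTEND_ORIGIN }))` from the `cors` package instead of rolling it by hand, but this is the behavior i think is missing.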
i’ve tried proxying the OpenAI API by using:

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  httpAgent: new HttpsProxyAgent.HttpsProxyAgent([OpenAI completions endpoint]),
});
i’ve also tried using external proxy middleware:
const apiProxy = createProxyMiddleware({ target: [OpenAI completions endpoint]});
...
app.use(['/roastArtists', '/roastTracks'], apiProxy)
... (all other code the same as below)
backend code:
app.post('/roastTracks', async function (req, res) {
  // TODO: implement try/catch so server doesn't just crash lmfao
  let topTracksStr = JSON.stringify(req.body);
  console.log("sending to chatGPT...");
  const completion = await openai.chat.completions.create({
    messages: [
      { role: "system", content: process.env.TRACKS_PROMPT },
      { role: "user", content: topTracksStr },
    ],
    model: "gpt-3.5-turbo",
  });
  console.log("finished!");
  console.log(completion.choices[0]);
  res.send({
    gpt_response: completion.choices[0]
  });
});
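one guess i have: if the OpenAI call throws (or Render kills a slow request), the handler dies before Express sends any headers, so the browser would report the failed response as a CORS error even though the real problem is server-side. here's a sketch of the handler with the try/catch from my TODO — `roastTracksHandler` and the error body are my own naming, and the client is passed in as a parameter (my real code uses a module-level `openai`) just so the function is self-contained:

```javascript
// Sketch of the /roastTracks handler with error handling. The OpenAI
// client is injected as a parameter so the function can be exercised
// with a fake client.
async function roastTracksHandler(req, res, openaiClient) {
  try {
    const topTracksStr = JSON.stringify(req.body);
    const completion = await openaiClient.chat.completions.create({
      messages: [
        { role: "system", content: process.env.TRACKS_PROMPT },
        { role: "user", content: topTracksStr },
      ],
      model: "gpt-3.5-turbo",
    });
    res.send({ gpt_response: completion.choices[0] });
  } catch (err) {
    // A thrown error no longer kills the request mid-flight; the client
    // gets a 500 (which still carries the CORS headers if the CORS
    // middleware ran before this handler).
    console.error("OpenAI call failed:", err);
    res.status(500).send({ error: "generation failed" });
  }
}

// Usage: app.post('/roastTracks', (req, res) => roastTracksHandler(req, res, openai));
```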
frontend code (sorry for the mess):
async function roastArtists(time_range) {
  console.log("roast artists:", time_range);
  setResponseState(LoadingState.LOADING);
  // not sure if i did this right lol
  spotifyApi.getMyTopArtists({ limit: 5, time_range: time_range })
    .then((response) => {
      let topArtists = [];
      for (let i = 0; i < response.items.length; i++) {
        topArtists.push(response.items[i].name);
      }
      return topArtists;
    }, function (err) {
      console.log('Something went wrong!', err);
      setResponseState(LoadingState.INPUT);
    })
    .then(async (topArtists) => {
      // proxy request to backend to ask for chatGPT output
      console.log("[generateRoast()] fetching gptResponse");
      const gptResponse = await fetch(BACKEND_ROUTE + "/roastArtists", {
        method: "POST",
        headers: { 'content-type': 'application/json' },
        body: JSON.stringify({ "topArtists": topArtists })
      });
      return gptResponse;
    }, function (err) {
      console.log('Something went wrong!', err);
      setResponseState(LoadingState.INPUT);
    })
    .then((res) => res.json())
    .then((gptJson) => {
      let gptRoast = gptJson.gpt_response;
      setResponseState(LoadingState.OUTPUT);
      setRoast(gptRoast.message.content);
    }, function (err) {
      console.log('Something went wrong!', err);
      setResponseState(LoadingState.INPUT);
    });
}
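not related to the bug, but while i'm here: the name-collecting loop in the first .then could be a one-line map. a small sketch (`topArtistNames` is just my name for it; `response` has the shape spotifyApi.getMyTopArtists resolves with):

```javascript
// Sketch: pull the artist names out of a getMyTopArtists-style response.
function topArtistNames(response) {
  return response.items.map((artist) => artist.name);
}
```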
Thanks!