Cloudflare Pages: "TypeError: adapter is not a function"

Hello Community,

I am calling the OpenAI gpt-3.5-turbo API using the OpenAI JavaScript module.
In my SvelteKit app, I have a server-side .ts script with the following code that calls OpenAI. The script works without any issues locally, but when I deploy to Cloudflare Pages I get this error:
"TypeError: adapter is not a function". I don’t see any other message, and the application just freezes. Any clue what is going on?

export async function getCompletion(listOfMessages: ): Promise {
  if (OPENAI_SWITCH == ‘false’) {
    return {
      role: “system”,
      content: “Coach is chilling and unavailable at the moment.”
    }
  }
  initializeOpenAI();
  let messages = createMessageObject(listOfMessages);
  console.log(messages);
  try {
    const completion = await openai.createChatCompletion({
      model: OPENAI_MODEL,
      messages: messages,
    });
    // console.log(completion);
    console.log(completion.data.choices);
    console.log(‘model’, completion.data.model);
    console.log(‘prompt tokens’, completion.data.usage.prompt_tokens);
    console.log(‘completion tokens’, completion.data.usage.completion_tokens);
    console.log(‘total tokens’, completion.data.usage.total_tokens);
    console.log(‘return value’, completion.data.choices[0].message);
    return completion.data.choices[0].message;
  }
  catch (err) {
    console.log(“Error when calling OpenAI”, err);
  }
}


Hi @mkpanchaal, nothing in this code looks like it would cause that adapter error, but I did find a few issues. Here is the revised code, with the changes listed below.

// Add the missing parameter type for listOfMessages and return type for the Promise
export async function getCompletion(listOfMessages: Array<any>): Promise<any> {
  if (OPENAI_SWITCH == "false") {
    return {
      role: "system",
      content: "Coach is chilling and unavailable at the moment.",
    };
  }
  initializeOpenAI();
  let messages = createMessageObject(listOfMessages);
  console.log(messages);
  try {
    const completion = await openai.createChatCompletion({
      model: OPENAI_MODEL,
      messages: messages,
    });
    // console.log(completion);
    console.log(completion.data.choices);
    console.log("model", completion.data.model);
    console.log("prompt tokens", completion.data.usage.prompt_tokens);
    console.log("completion tokens", completion.data.usage.completion_tokens);
    console.log("total tokens", completion.data.usage.total_tokens);
    console.log("return value", completion.data.choices[0].message);
    return completion.data.choices[0].message;
  } catch (err) {
    console.log("Error when calling OpenAI", err);
  }
}
  1. Added the missing parameter type for listOfMessages and the return type for the Promise.
     Original: export async function getCompletion(listOfMessages: ): Promise {
     Revised: export async function getCompletion(listOfMessages: Array<any>): Promise<any> {
     This specifies that listOfMessages is an array of any type and that the function returns a Promise resolving to any type.

  2. Replaced the curly single quotes (‘’) around false with standard double quotes ("").
     Original: if (OPENAI_SWITCH == ‘false’) {
     Revised: if (OPENAI_SWITCH == "false") {
     Curly quotes are not valid string delimiters in JavaScript/TypeScript and would cause a syntax error; they usually sneak in via copy-paste.

Thanks for the response, @saad.codes. I applied the suggested changes, but unfortunately they didn’t resolve the issue. I suspect it has something to do with the openai npm package and its ability to run on Cloudflare Pages. If anyone has any clue about this, it would be great.

For people who stumble here for the same problem:

  1. Cloudflare Pages runs server code on the Workers edge runtime, where the regular openai npm package won’t work because it relies on Axios under the hood. That is what throws the error above.
  2. Instead, I wrote my own wrapper that uses fetch and calls https://api.openai.com/v1/chat/completions directly (a rough sketch is below).
  3. Bonus benefit: it reduced my build size by 1 MB.
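For anyone who wants a starting point, here is a minimal sketch of such a fetch-based wrapper, assuming the API key and model come from SvelteKit’s private environment variables and that the caller already builds the messages array. The ChatMessage type and the env var names are illustrative, not the exact code from the original post:

// Minimal fetch-based replacement for openai.createChatCompletion.
// OPENAI_API_KEY and OPENAI_MODEL are assumed to be private env vars;
// adjust the env handling to however your Cloudflare Pages project exposes secrets.
import { OPENAI_API_KEY, OPENAI_MODEL } from '$env/static/private';

type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

export async function getCompletion(messages: ChatMessage[]): Promise<ChatMessage> {
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${OPENAI_API_KEY}`,
    },
    body: JSON.stringify({ model: OPENAI_MODEL, messages: messages }),
  });

  if (!response.ok) {
    throw new Error(`OpenAI request failed: ${response.status} ${await response.text()}`);
  }

  const data = await response.json();
  // The body has the same shape as completion.data in the axios client: choices, usage, model, ...
  console.log('total tokens', data.usage.total_tokens);
  return data.choices[0].message;
}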

Thanks for digging in, @mkpanchaal. Just leaving a quick note to mention that I have the same use case (OpenAI + Cloudflare Workers) and am also encountering the same issue.

This worked for me:

export default {
  async fetch(request, env, ctx) {
    const url = 'https://api.openai.com/v1/chat/completions';
    const requestOptions = {
      'method': 'POST',
      'headers': {
        'Content-Type': 'application/json',
        // https://developers.cloudflare.com/workers/platform/environment-variables/#add-secrets-to-your-project
        'Authorization': `Bearer ${env.OPENAI_API_KEY}`,
      },
      'body': JSON.stringify({
        'model': 'gpt-3.5-turbo',
        'messages': [{'role': 'user', 'content': 'Hello, world!'}],
      }),
    };
    const response = await fetch(url, requestOptions);
    const json = await response.json();
    const responseOptions = {
      'headers': {
        'content-type': 'application/json',
      },
    };
    return new Response(JSON.stringify(json), responseOptions);
  },
};

I’m running into the same issue.

I’ve read elsewhere that this happens because the openai package uses Axios instead of fetch, so you can do this:

import fetchAdapter from "@vespaiach/axios-fetch-adapter";

// ...

  const configuration = new Configuration({
    apiKey: apiKey,
    baseOptions: {
      adapter: fetchAdapter
    }
  });

which worked for me 🙂

I also added "@vespaiach/axios-fetch-adapter": "*" to my package.json dependencies.
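For completeness, here is a rough sketch of how that configuration plugs into the rest of the openai v3 client. The createOpenAIClient/askOnce function names and the way the API key is passed in are placeholders for illustration, not part of the original posts:

import { Configuration, OpenAIApi } from "openai"; // openai v3.x (the axios-based client)
import fetchAdapter from "@vespaiach/axios-fetch-adapter";

// Pass the API key in from however your platform exposes secrets.
export function createOpenAIClient(apiKey: string): OpenAIApi {
  const configuration = new Configuration({
    apiKey: apiKey,
    baseOptions: {
      adapter: fetchAdapter, // make axios use fetch, which the Workers runtime provides
    },
  });
  return new OpenAIApi(configuration);
}

// Example usage:
export async function askOnce(apiKey: string, content: string) {
  const openai = createOpenAIClient(apiKey);
  const completion = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: content }],
  });
  return completion.data.choices[0].message;
}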