Node.js returns 401 error (running locally) [Solved]

Hi,
Not sure why I’m getting the error
{"message":"Request failed with status code 401","name":"Error","stack":"Error: Request failed with status code 401\n...
I have .env file with the API key:
OPENAI_API_KEY="sk-fU....."
My server.js calls this (I tried including the organization ID as well):

const configuration = new Configuration({
  organization: "org-xxxxxxxxxxxxx",
  apiKey: process.env.OPENAI_API_KEY,
});

I cleared the cache and tried three different browsers on macOS Ventura (Intel).
I have double- and triple-checked my API key but can’t figure this one out.
Any pointers?

Welcome to the community!

A 401 error means the request couldn’t be authenticated. You need to send the API key in the header as a bearer token…

This page might be helpful; it has a working example for Node.js…

Hope this helps.

ETA:

  const headers = {
    'Authorization': `Bearer ${process.env.OPENAI_SECRET_KEY}`,
  };
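
For context, here’s a rough sketch of a full request using that header. This assumes Node 18+ (so fetch is available globally, inside an async function or an ES module) and that your key really is in process.env.OPENAI_SECRET_KEY; adjust the names to match your setup.

const headers = {
  'Content-Type': 'application/json',
  'Authorization': `Bearer ${process.env.OPENAI_SECRET_KEY}`,
};

const response = await fetch('https://api.openai.com/v1/completions', {
  method: 'POST',
  headers,
  body: JSON.stringify({
    model: 'text-davinci-003', // pick whichever model you plan to use
    prompt: 'Say hello',
    max_tokens: 5,
  }),
});
const data = await response.json();
console.log(data.choices[0].text);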

Thank you @PaulBellow!
It works now :smiley:

1 Like

You can add this in the script file in which you’re calling the API:
const response = await fetch('http://localhost:5000', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${process.env.OPENAI_SECRET_KEY}`
  },
  body: JSON.stringify({
    prompt: data.get('prompt')
  })
})

Make sure to check what URL you’re hitting; since you’ll be using a local server, send the request to that local server to get the response.
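
For reference, here’s a minimal sketch of what that local server could look like. This is only a hypothetical Express app listening on port 5000; the route, env var, and variable names are placeholders, so adapt them to your project.

const express = require('express');
const { Configuration, OpenAIApi } = require('openai');
require('dotenv').config();

const app = express();
app.use(express.json());

// openai v3 client; the key is read from .env on the server, never from the browser
const openai = new OpenAIApi(new Configuration({ apiKey: process.env.OPENAI_API_KEY }));

app.post('/', async (req, res) => {
  try {
    const completion = await openai.createCompletion({
      model: 'text-davinci-003',
      prompt: req.body.prompt,
      max_tokens: 256,
    });
    res.json({ text: completion.data.choices[0].text });
  } catch (err) {
    // Forward the API status (e.g. 401) so the client can see what went wrong
    res.status(err.response ? err.response.status : 500).json({ error: err.message });
  }
});

app.listen(5000, () => console.log('Listening on http://localhost:5000'));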

Hi @humayounshah, I’m using the Expo managed workflow and my code had been working until today. Last tested on the 2nd of January and all was well. Note I’m not using Node.js; the API call is happening on the client side.

const configuration = new Configuration({
  apiKey: "sk-_________________"
});
const openai = new OpenAIApi(configuration);

const generateText = async (prompt) => {
  try {
    const completion = await openai.createCompletion({
      model: 'text-davinci-003', prompt, temperature: 1, max_tokens: 4048,
    });
    setIsLoading(false)
    setAIResults(completion.data.choices[0].text)
  } catch (error) {
    console.log(error)
    setIsLoading(false)
  }
}

const initializePrompt = useCallback(() => {
    const prompt = `Generate a ${type} based on ${summary}`
    setIsLoading(true)
    generateText(prompt)
})

Please help, everything was okay until now.

I have tried replacing it the way you mentioned, but the same error is coming up.

const completion = await openai.createCompletion({
  model: "text-davinci-003",
  prompt: `${prompt}`,
  temperature: 0.7, // Higher values mean the model will take more risks.
  max_tokens: 256, // The maximum number of tokens to generate in the completion. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).
  top_p: 1, // An alternative to sampling with temperature, called nucleus sampling.
  frequency_penalty: 0, // Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
  presence_penalty: 0, // Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics.
});

Give your values like this in openai.createCompletion.

Show me something that you’re implementing so that I can help you accordingly.

1 Like

I’m getting this error; can you please check how I can resolve it? I have deployed the server, and I get this error both before and after deploying.

Same problem here, can anyone help?!

Facing the same problem, any fix?

A few updates to this: if you’re going to rely on this example, please note these two issues:

  1. You will need to use the older version of got (install using ‘npm install got@11.8.3’), since the newer version of got is an ES Module and you cannot “require” it anymore (see Error [ERR_REQUIRE_ESM]: require() of ES Module not supported | bobbyhadz).
  2. The ‘model’ parameter in the request payload is required for the API to properly route the request to the correct model, so the params would change to include whichever model you plan on using (a full request sketch follows the params below):
const params = {
    "model": 'text-davinci-003',
    "prompt": prompt,
    "max_tokens": 160,
    "temperature": 0.7,
    "frequency_penalty": 0.5
  };
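
For example, here’s a rough sketch of sending those params with got@11.8.3; the endpoint, header, and env var name are assumptions based on the rest of this thread, so adjust them to your setup.

const got = require('got'); // works with got@11.8.3 (v12+ is ESM-only)

(async () => {
  const params = {
    model: 'text-davinci-003',
    prompt: 'Say hello',
    max_tokens: 160,
    temperature: 0.7,
    frequency_penalty: 0.5,
  };

  const response = await got.post('https://api.openai.com/v1/completions', {
    json: params, // got serializes this and sets Content-Type: application/json
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
    responseType: 'json',
  });

  console.log(response.body.choices[0].text);
})();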

Hey, I am getting the same 401 error. Did you solve it?

Yes - the most common reason for this is that the API key is not correct or is not being set appropriately. To see if this is the issue, test it by assigning your API key directly (without using any loading package) to the API key variable. If it works, your issue is with how you are loading the key.
Also, be sure to include the model name (like I showed in my previous response).

This commonly occurs when you are using code generated by ChatGPT. Try using code generated through the Playground, which is more current, or better yet, the docs have great examples of the most current code required.

BTW, another common issue is that the completion URL is set wrong; the correct one is https://api.openai.com/v1/completions.
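
Putting those checks together, here’s a quick test sketch: a hard-coded key (only for the test, never commit it), the model param included, and the correct URL. The key string below is just a placeholder.

const axios = require('axios');

const apiKey = 'sk-...'; // paste your real key here temporarily, then remove it

axios.post(
  'https://api.openai.com/v1/completions', // the correct completions URL
  { model: 'text-davinci-003', prompt: 'Say hello', max_tokens: 5 },
  { headers: { 'Content-Type': 'application/json', Authorization: `Bearer ${apiKey}` } }
)
  .then((res) => console.log(res.data.choices[0].text))
  .catch((err) => console.log(err.response ? err.response.status : err.message));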

1 Like

Hey all! You might want to check out the new error codes guide, which gives suggestions on how to mitigate pretty much all error codes: OpenAI API

1 Like
require('dotenv').config();

const { Client, GatewayIntents, GatewayIntentBits } = require('discord.js');
const client = new Client({
  intents: [
    GatewayIntentBits.Guilds,
    GatewayIntentBits.GuildMessages,
    GatewayIntentBits.MessageContent,
  ],
});

const headers = {
    'Authorization': `Bearer ${process.env.OPENAI_SECRET_KEY}`,
  };
const { Configuration, OpenAIApi } = require('openai');
const configuration = new Configuration({
    organization: process.env.OPENAI_ORG,
    apikey: process.env.OPENAI_KEY,
});
const openai = new OpenAIApi(configuration);

client.on('messageCreate', async function (message) {
  try {
    // Don't respond to yourself or any other bots
    if (message.author.bot) return;

    const gbtResponse = await openai.createCompletion({
      model: 'davinci',
      prompt: `ChatGBT is a friendly chatbot.\n\
        ChatGBT: Hello, How are you? \n\
        ${message.author.username}: ${message.content} \n\
        ChatGBT:`,
      max_tokens: 100,
      stop: ['ChatGBT:', 'Arc Sensei', 'Arc', 'Sensei'],
    });

    message.reply(`${gbtResponse.data.choices[0].text}`);
    return;
  } catch (err) {
    console.log(err);
  }
});

client.login(process.env.DISCORD_TOKEN);
console.log('Chat GPT bot is Online on Discord');

This is the code.

Error: Request failed with status code 401
    at createError (D:\ChatGBT-discord-bot\node_modules\axios\lib\core\createError.js:16:15)
    at settle (D:\ChatGBT-discord-bot\node_modules\axios\lib\core\settle.js:17:12)
    at IncomingMessage.handleStreamEnd (D:\ChatGBT-discord-bot\node_modules\axios\lib\adapters\http.js:322:11)
    at IncomingMessage.emit (node:events:525:35)
    at endReadableNT (node:internal/streams/readable:1359:12)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  config: {
    transitional: {
      silentJSONParsing: true,
      forcedJSONParsing: true,
      clarifyTimeoutError: false
    },
    adapter: [Function: httpAdapter],
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    validateStatus: [Function: validateStatus],
    headers: {
      Accept: 'application/json, text/plain, */*',
      'Content-Type': 'application/json',
      'User-Agent': 'OpenAI/NodeJS/3.2.1',
      Authorization: 'Bearer undefined',
      'OpenAI-Organization': 'org-InOyxHkSJJpGNVwSzaYlMRp0',
      'Content-Length': 263
    },
    method: 'post',
    data: '{"model":"davinci","prompt":"ChatGBT is a friendly chatbot.\\n        ChatGBT: Hello, How are you? \\n        Arc Sensei: hello there, what is an interesting fact about web design \\n        ChatGBT:","max_tokens":100,"stop":["ChatGBT:","Arc Sensei","Arc","Sensei"]}',
    url: 'https://api.openai.com/v1/completions'
  },
  request: <ref *1> ClientRequest {
    _events: [Object: null prototype] {
      abort: [Function (anonymous)],
      aborted: [Function (anonymous)],
      connect: [Function (anonymous)],
      error: [Function (anonymous)],
      socket: [Function (anonymous)],
      timeout: [Function (anonymous)],
      finish: [Function: requestOnFinish]
    },
    _eventsCount: 7,
    _maxListeners: undefined,
    outputData: [],
    outputSize: 0,
    writable: true,
    destroyed: false,
    _last: true,
    chunkedEncoding: false,
    shouldKeepAlive: false,
    maxRequestsOnConnectionReached: false,
    _defaultKeepAlive: true,
    useChunkedEncodingByDefault: true,
    sendDate: false,
    _removedConnection: false,
    _removedContLen: false,
    _removedTE: false,
    strictContentLength: false,
    _contentLength: 263,
    _hasBody: true,
    _trailer: '',
    finished: true,
    _headerSent: true,
    _closed: false,
    socket: TLSSocket {
      _tlsOptions: [Object],
      _secureEstablished: true,
      _securePending: false,
      _newSessionPending: false,
      _controlReleased: true,
      secureConnecting: false,
      _SNICallback: null,
      servername: 'api.openai.com',
      alpnProtocol: false,
      authorized: true,
      authorizationError: null,
      encrypted: true,
      _events: [Object: null prototype],
      _eventsCount: 10,
      connecting: false,
      _hadError: false,
      _parent: null,
      _host: 'api.openai.com',
      _closeAfterHandlingError: false,
      _readableState: [ReadableState],
      _maxListeners: undefined,
      _writableState: [WritableState],
      allowHalfOpen: false,
      _sockname: null,
      _pendingData: null,
      _pendingEncoding: '',
      server: undefined,
      _server: null,
      ssl: [TLSWrap],
      _requestCert: true,
      _rejectUnauthorized: true,
      parser: null,
      _httpMessage: [Circular *1],
      [Symbol(res)]: [TLSWrap],
      [Symbol(verified)]: true,
      [Symbol(pendingSession)]: null,
      [Symbol(async_id_symbol)]: 145,
      [Symbol(kHandle)]: [TLSWrap],
      [Symbol(lastWriteQueueSize)]: 0,
      [Symbol(timeout)]: null,
      [Symbol(kBuffer)]: null,
      [Symbol(kBufferCb)]: null,
      [Symbol(kBufferGen)]: null,
      [Symbol(kCapture)]: false,
      [Symbol(kSetNoDelay)]: false,
      [Symbol(kSetKeepAlive)]: true,
      [Symbol(kSetKeepAliveInitialDelay)]: 60,
      [Symbol(kBytesRead)]: 0,
      [Symbol(kBytesWritten)]: 0,
      [Symbol(connect-options)]: [Object]
    },
    _header: 'POST /v1/completions HTTP/1.1\r\n' +
      'Accept: application/json, text/plain, */*\r\n' +
      'Content-Type: application/json\r\n' +
      'User-Agent: OpenAI/NodeJS/3.2.1\r\n' +
      'Authorization: Bearer undefined\r\n' +
      'OpenAI-Organization: org-InOyxHkSJJpGNVwSzaYlMRp0\r\n' +
      'Content-Length: 263\r\n' +
      'Host: api.openai.com\r\n' +
      'Connection: close\r\n' +
      '\r\n',
    _keepAliveTimeout: 0,
    _onPendingData: [Function: nop],
    agent: Agent {
      _events: [Object: null prototype],
      _eventsCount: 2,
      _maxListeners: undefined,
      defaultPort: 443,
      protocol: 'https:',
      options: [Object: null prototype],
      requests: [Object: null prototype] {},
      sockets: [Object: null prototype],
      freeSockets: [Object: null prototype] {},
      keepAliveMsecs: 1000,
      keepAlive: false,
      maxSockets: Infinity,
      maxFreeSockets: 256,
      scheduling: 'lifo',
      maxTotalSockets: Infinity,
      totalSocketCount: 1,
      maxCachedSessions: 100,
      _sessionCache: [Object],
      [Symbol(kCapture)]: false
    },
    socketPath: undefined,
    method: 'POST',
    maxHeaderSize: undefined,
    insecureHTTPParser: undefined,
    joinDuplicateHeaders: undefined,
    path: '/v1/completions',
    _ended: true,
    res: IncomingMessage {
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 4,
      _maxListeners: undefined,
      socket: [TLSSocket],
      httpVersionMajor: 1,
      httpVersionMinor: 1,
      httpVersion: '1.1',
      complete: true,
      rawHeaders: [Array],
      rawTrailers: [],
      joinDuplicateHeaders: undefined,
      aborted: false,
      upgrade: false,
      url: '',
      method: null,
      statusCode: 401,
      statusMessage: 'Unauthorized',
      client: [TLSSocket],
      _consuming: false,
      _dumped: false,
      req: [Circular *1],
      responseUrl: 'https://api.openai.com/v1/completions',
      redirects: [],
      [Symbol(kCapture)]: false,
      [Symbol(kHeaders)]: [Object],
      [Symbol(kHeadersCount)]: 22,
      [Symbol(kTrailers)]: null,
      [Symbol(kTrailersCount)]: 0
    },
    aborted: false,
    timeoutCb: null,
    upgradeOrConnect: false,
    parser: null,
    maxHeadersCount: null,
    reusedSocket: false,
    host: 'api.openai.com',
    protocol: 'https:',
    _redirectable: Writable {
      _writableState: [WritableState],
      _events: [Object: null prototype],
      _eventsCount: 3,
      _maxListeners: undefined,
      _options: [Object],
      _ended: true,
      _ending: true,
      _redirectCount: 0,
      _redirects: [],
      _requestBodyLength: 263,
      _requestBodyBuffers: [],
      _onNativeResponse: [Function (anonymous)],
      _currentRequest: [Circular *1],
      _currentUrl: 'https://api.openai.com/v1/completions',
      [Symbol(kCapture)]: false
    },
    [Symbol(kCapture)]: false,
    [Symbol(kBytesWritten)]: 0,
    [Symbol(kNeedDrain)]: false,
    [Symbol(corked)]: 0,
    [Symbol(kOutHeaders)]: [Object: null prototype] {
      accept: [Array],
      'content-type': [Array],
      'user-agent': [Array],
      authorization: [Array],
      'openai-organization': [Array],
      'content-length': [Array],
      host: [Array]
    },
    [Symbol(errored)]: null,
    [Symbol(kUniqueHeaders)]: null
  },
  response: {
    status: 401,
    statusText: 'Unauthorized',
    headers: {
      date: 'Tue, 30 May 2023 19:47:23 GMT',
      'content-type': 'application/json; charset=utf-8',
      'content-length': '146',
      connection: 'close',
      vary: 'Origin',
      'x-request-id': 'a3f33e2ed2293c6c28c435ec1c457a6a',
      'strict-transport-security': 'max-age=15724800; includeSubDomains',
      'cf-cache-status': 'DYNAMIC',
      server: 'cloudflare',
      'cf-ray': '7cf97d188a019b82-FRA',
      'alt-svc': 'h3=":443"; ma=86400'
    },
    config: {
      transitional: [Object],
      adapter: [Function: httpAdapter],
      transformRequest: [Array],
      transformResponse: [Array],
      timeout: 0,
      xsrfCookieName: 'XSRF-TOKEN',
      xsrfHeaderName: 'X-XSRF-TOKEN',
      maxContentLength: -1,
      maxBodyLength: -1,
      validateStatus: [Function: validateStatus],
      headers: [Object],
      method: 'post',
      data: '{"model":"davinci","prompt":"ChatGBT is a friendly chatbot.\\n        ChatGBT: Hello, How are you? \\n        Arc Sensei: hello there, what is an interesting fact about web design \\n        ChatGBT:","max_tokens":100,"stop":["ChatGBT:","Arc Sensei","Arc","Sensei"]}',
      url: 'https://api.openai.com/v1/completions'
    },
    request: <ref *1> ClientRequest {
      _events: [Object: null prototype],
      _eventsCount: 7,
      _maxListeners: undefined,
      outputData: [],
      outputSize: 0,
      writable: true,
      destroyed: false,
      _last: true,
      chunkedEncoding: false,
      shouldKeepAlive: false,
      maxRequestsOnConnectionReached: false,
      _defaultKeepAlive: true,
      useChunkedEncodingByDefault: true,
      sendDate: false,
      _removedConnection: false,
      _removedContLen: false,
      _removedTE: false,
      strictContentLength: false,
      _contentLength: 263,
      _hasBody: true,
      _trailer: '',
      finished: true,
      _headerSent: true,
      _closed: false,
      socket: [TLSSocket],
      _header: 'POST /v1/completions HTTP/1.1\r\n' +
        'Accept: application/json, text/plain, */*\r\n' +
        'Content-Type: application/json\r\n' +
        'User-Agent: OpenAI/NodeJS/3.2.1\r\n' +
        'Authorization: Bearer undefined\r\n' +
        'OpenAI-Organization: org-InOyxHkSJJpGNVwSzaYlMRp0\r\n' +
        'Content-Length: 263\r\n' +
        'Host: api.openai.com\r\n' +
        'Connection: close\r\n' +
        '\r\n',
      _keepAliveTimeout: 0,
      _onPendingData: [Function: nop],
      agent: [Agent],
      socketPath: undefined,
      method: 'POST',
      maxHeaderSize: undefined,
      insecureHTTPParser: undefined,
      joinDuplicateHeaders: undefined,
      path: '/v1/completions',
      _ended: true,
      res: [IncomingMessage],
      aborted: false,
      timeoutCb: null,
      upgradeOrConnect: false,
      parser: null,
      maxHeadersCount: null,
      reusedSocket: false,
      host: 'api.openai.com',
      protocol: 'https:',
      _redirectable: [Writable],
      [Symbol(kCapture)]: false,
      [Symbol(kBytesWritten)]: 0,
      [Symbol(kNeedDrain)]: false,
      [Symbol(corked)]: 0,
      [Symbol(kOutHeaders)]: [Object: null prototype],
      [Symbol(errored)]: null,
      [Symbol(kUniqueHeaders)]: null
    },
    data: { error: [Object] }
  },
  isAxiosError: true,
  toJSON: [Function: toJSON]

This is the error.

The code is supposed to reply to Discord messages with completions pulled from the OpenAI API, but I’m not sure why I keep getting the 401 error code. Anyone got any ideas?

You’ve referenced process.env.OPENAI_SECRET_KEY and process.env.OPENAI_KEY. Which one is it? And are you sure (100% sure) that it is set? 401 means a missing or bad API key. You can also try generating a new key and setting that.

Error confirms it is not set:

'Authorization: Bearer undefined\r\n' +
1 Like

Yeah, I would agree… It’s likely one or the other, not both. If you’re using the OpenAI client library, you shouldn’t need to pass in a custom Authorization header. The issue is more likely that your key is in process.env.OPENAI_SECRET_KEY and not process.env.OPENAI_KEY. Just want to make sure you’re using the correct solution here so that others who hit this don’t get confused.
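
For example, a sketch of the setup I mean (assuming the v3 openai package; note the property must be spelled apiKey, and the env var name must match what’s actually defined in your .env):

require('dotenv').config();
const { Configuration, OpenAIApi } = require('openai');

const configuration = new Configuration({
  organization: process.env.OPENAI_ORG,
  apiKey: process.env.OPENAI_KEY, // must be apiKey, and OPENAI_KEY must exist in .env
});
const openai = new OpenAIApi(configuration);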

I have renewed the Discord token and the OpenAI API key, and the error is still recurring. I also removed the secret key line, and no luck. Could the problem be in the axios module? I’ve seen some posts talking about that.

Do you have the correct key set in your environment variables as OPENAI_KEY?

In the stack trace you can search for Bearer and see what value is being sent; it was undefined in your previous post.
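
If it is still undefined, a quick debugging sketch right before you construct the client can confirm whether dotenv is actually loading the variable:

require('dotenv').config();
// Should print true; if it prints false, the key isn't being read from your .env
console.log('OPENAI_KEY set?', Boolean(process.env.OPENAI_KEY));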