The examples in the embeddings guide use the tiktoken library to count tokens before making a request.
But since I'm building a serverless extension, bundling a heavy library into it doesn't seem practical (I could be wrong; I'm not very familiar with Node.js libraries).
Is it possible to limit the token count in the HTTP request itself?
If not, how could I include a tokenizer in my extension? (Keep in mind that I'm still new to programming.)
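For now, the best I could come up with is a rough estimate based on character count. I read that for typical English text one token is roughly 4 characters, so this is only an approximation and not a real substitute for tiktoken:

```javascript
// Rough token estimate: ~4 characters per token for typical English text.
// This is an approximation only; the actual tokenizer can differ a lot,
// especially for code or non-English text.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Example: skip or truncate inputs that look too long before sending them.
const MAX_TOKENS = 8191; // input limit for text-embedding-ada-002
function looksTooLong(text) {
  return estimateTokens(text) > MAX_TOKENS;
}
```

Would an approximation like this be good enough in practice, or do I really need exact counts?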
This is the function I use to request the embeddings:
async function OPENAI_Embedding(texts) {
  const apiUrl = 'https://api.openai.com/v1/embeddings';
  const requestOptions = {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${OPENAI_API_KEY}`, // defined elsewhere in the extension
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      input: texts,
      model: 'text-embedding-ada-002',
    }),
  };

  try {
    const response = await fetch(apiUrl, requestOptions);
    // fetch() does not reject on HTTP error statuses, so check explicitly
    // before reading data.usage, which is absent from error responses.
    if (!response.ok) {
      throw new Error(`OpenAI API returned status ${response.status}`);
    }
    const data = await response.json();
    STORAGE_tokensUsed_add(data.usage.total_tokens); // running usage counter, defined elsewhere
    return data.data;
  } catch (error) {
    console.error('Error making request to OpenAI EmbedTexts:', error);
    return null;
  }
}