Please help me with this error, I have been stuck on it for the last 3 days:

{
  "error": {
    "message": "You didn't provide an API key. You need to provide your API key in an Authorization header using Bearer auth (i.e. Authorization: Bearer YOUR_KEY), or as the password field (with blank username) if you're accessing the API from your browser and are prompted for a username and password. You can obtain an API key from https://platform.openai.com/account/api-keys.",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
The API acts like you are not sending it a value for the API key.
See if that is true:
URI url = new URI(API_URL);
HttpURLConnection connection = (HttpURLConnection) url.toURL().openConnection();
// Validate API_KEY
if (API_KEY == null || API_KEY.isEmpty() || !API_KEY.startsWith("sk-")) {
System.out.println("Invalid API_KEY: " + API_KEY);
throw new IllegalArgumentException("API_KEY is invalid or not set correctly.");
}
// Set request method to POST
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Type", "application/json;charset=utf-8");
connection.setRequestProperty("Authorization", "Bearer " + API_KEY);
connection.setDoOutput(true);
// Send the request
..
Then I’m not sure, but ponder the result if you move openConnection(); to after setting the connection method and properties…
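For reference, here is a minimal sketch of the full request flow with HttpURLConnection. Note that openConnection() only creates the connection object; nothing goes over the wire until you write the body or read the response, so headers set after it are still sent. API_URL, API_KEY, and jsonBody are placeholders for values from your own code.

// Sketch only, inside a method that declares throws Exception.
// Needs: java.net.URI, java.net.HttpURLConnection, java.io.InputStream,
// java.io.OutputStream, java.nio.charset.StandardCharsets.
URI url = new URI(API_URL);
HttpURLConnection connection = (HttpURLConnection) url.toURL().openConnection();
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Type", "application/json;charset=utf-8");
connection.setRequestProperty("Authorization", "Bearer " + API_KEY);
connection.setDoOutput(true);

// Write the JSON body; the headers above are sent along with it.
try (OutputStream os = connection.getOutputStream()) {
    os.write(jsonBody.getBytes(StandardCharsets.UTF_8));
}

// On a non-2xx status, read the error stream so the API's JSON error body
// (like the "You didn't provide an API key" message) is printed.
int status = connection.getResponseCode();
InputStream in = (status >= 200 && status < 300)
        ? connection.getInputStream()
        : connection.getErrorStream();
System.out.println(status + ": " + new String(in.readAllBytes(), StandardCharsets.UTF_8));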
Thank you for your time. I have included the API-key validation code, but it is still showing the same error. However, the code below works perfectly, so I think the issue is with the image part. Since the length of the base64 image is 354848, could it be a token-limit issue (though it does not show any error like that)?
String jsonInputString2 = "{\n" +
    "  \"model\": \"gpt-4-vision-preview\",\n" +
    "  \"messages\": [\n" +
    "    {\n" +
    "      \"role\": \"system\",\n" +
    "      \"content\": \"You are a helpful assistant.\"\n" +
    "    },\n" +
    "    {\n" +
    "      \"role\": \"user\",\n" +
    "      \"content\": \"" + command + "\"\n" +
    "    }\n" +
    "  ]\n" +
    "}";
While the binary file you send may be large, as long as you use the proper "content" message format to send it, the image will only count as between 85 and 1445 tokens, depending on the quality and size you specify.
If you had made a mistake and sent garbage text that was too long, the API would return a different error, saying that you exceeded the model's context length or your usage limit, and both of those messages include a token count.
Here’s a link to a post showing how the body of a message should be sent for base64.
I’ll let you decipher what your code is doing between all the escaping.
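In case it helps, here is a sketch of the kind of body that post describes, written in the same string-concatenation style as your code. The model name, the "detail" value, and the base64Image / command variables are placeholders for illustration; the key point is that "content" becomes an array of parts, and the image travels as a data: URL inside the "image_url" part.

// Sketch only: base64Image is assumed to hold a base64-encoded JPEG and
// command the user's text prompt. With "detail": "low" the image counts as a
// flat 85 tokens; "high" can cost up to 1445 depending on the image size.
String jsonBody = "{\n" +
    "  \"model\": \"gpt-4-vision-preview\",\n" +
    "  \"messages\": [\n" +
    "    {\n" +
    "      \"role\": \"user\",\n" +
    "      \"content\": [\n" +
    "        { \"type\": \"text\", \"text\": \"" + command + "\" },\n" +
    "        { \"type\": \"image_url\",\n" +
    "          \"image_url\": { \"url\": \"data:image/jpeg;base64," + base64Image + "\", \"detail\": \"low\" } }\n" +
    "      ]\n" +
    "    }\n" +
    "  ]\n" +
    "}";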