No, @raymonddavey.
I was replying to @raymonddavey's suggestion asking you to change or remove params from a valid JSON params file when you were getting a 400 error: it is normally not valid params which can be "changed" or "fixed" to solve a 400 error.
If you search the community using the magnifying glass icon in the top right corner, @Securigy , and search for “400 errors”, you will see that the majority of these errors are caused by either:
- Issues with account credits and limits.
- Issues with malformed JSON params or an invalid API call.
In other words, tweaking params which are already valid (and free of JSON errors in your params data) will more than likely not solve the 400 errors. You need to confirm that your account is working and that your code is not throwing errors or sending malformed params.
HTH

As I mentioned, my requests using the GPT-3 API "text-davinci-003" work just fine, so it does not look like I'm hitting the hard limit…
I checked my account to be sure: $12.85 out of $18.00 before they start charging me…
1 Like
I had repeated 400 errors last night getting Whisper to work from the OpenAI Python bindings. So I reverted to the curl example in the docs, which worked, and then just made a direct call from Python (pythonized the curl call). Same thing with ChatGPT Turbo: I went straight to code with no bindings (after reverse-engineering the curl call).
For me it’s better to have low dependency overhead anyway. Win-win.
So I have two differences from your code:
Put this above your user-agent header
req.Headers.Add("Accept", "application/json");
Also, I am not using the ChatCompletionRequest object. Instead I define the request as:
List<OpenAIChat> _chat = new List<OpenAIChat>();
_chat.Add(new OpenAIChat() { role = "user", content = "Hello!" });
var request = new { messages = _chat, model = "gpt-3.5-turbo", stream = true };
The definition for the OpenAIChat class is:
public sealed class OpenAIChat
{
public string role { get; set; }
public string content { get; set; }
}
Otherwise our code is the same (you may want to get rid of the sealed part of the class definition)
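To put the two pieces together, here is a rough end-to-end sketch of that approach (a sketch only, assuming Newtonsoft.Json; ChatRequestDemo is just a placeholder name, and OpenAIChat is the class defined above):
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

public static class ChatRequestDemo
{
    public static void Main()
    {
        // Build the request exactly as described above, using the OpenAIChat class defined earlier
        var _chat = new List<OpenAIChat>();
        _chat.Add(new OpenAIChat() { role = "user", content = "Hello!" });
        var request = new { messages = _chat, model = "gpt-3.5-turbo", stream = true };

        // Serialize and inspect the payload: "messages" must come out as a JSON array,
        // not as a string containing an array
        string json = JsonConvert.SerializeObject(request);
        Console.WriteLine(json);
        // {"messages":[{"role":"user","content":"Hello!"}],"model":"gpt-3.5-turbo","stream":true}
    }
}
If the printed JSON shows quotes around the square brackets, the request will come back as a 400.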
Yes, OpenAI (Logan) suggested in an earlier post on this same topic to try a basic curl call, like this [you can modify as you like @Securigy (I have not tested it)]:
curl https://api.openai.com/v1/chat/completions \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer YOUR_API_KEY' \
-d '{
"model": "gpt-3.5-turbo",
"messages": [{"role":"user", "content": "How are you?"}],
}'
HTH

Well, curl is not helping me in my application… besides, if curl works, C# should too…
1 Like
Folks, including @curt.kennedy as he mentioned: use curl to ensure there are no underlying API or account problems.
It's part of basic API troubleshooting and helps with debugging.
2 Likes
Just start with curl to rule out account issues. If that works, there is a syntax error and you fix it, or go straight to code without the library, like I did.
2 Likes
I tried with ‘Accept’ header - same old…
I tried without any parameters, except model, stream, and messages.
From the jsonContent string, the 'messages' field and the whole payload look like this:
{"messages":"[{\"role\":\"system\",\"content\":\"You are an avid story teller. You specialize in real life stories.\"},{\"role\":\"user\",\"content\":\"Traveling Canada\"}]","model":"gpt-3.5-turbo","stream":true}
So, this is beyond any objects that I use before that… maybe I should use PostAsync…
string jsonContent = JsonConvert.SerializeObject(request, new JsonSerializerSettings() { NullValueHandling = NullValueHandling.Ignore });
var stringContent = new StringContent(jsonContent, UnicodeEncoding.UTF8, "application/json");
string url = String.Format("{0}/chat/completions", Api.BaseUrl);
Log.VerboseFormat("Calling SendAsync with URL: {0}, and Engine: {1}", url, request.Model);
using (HttpRequestMessage req = new HttpRequestMessage(HttpMethod.Post, url))
{
req.Content = stringContent;
req.Headers.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", Api.Auth.ApiKey);
req.Headers.Add("Accept", "application/json");
req.Headers.Add("User-Agent", "GSS/OpenAI_GPT3");
var response = await client.SendAsync(req, HttpCompletionOption.ResponseHeadersRead);
if (response.IsSuccessStatusCode)
Just a feeling that the API is still half-baked…
I'm fairly sure the issue is the quotes before and after the square brackets.
It is an array, not a string representation of an array.
It should be like this:
{"messages":[{"role":"user","content":"Generate an extensive list of subtopics that need to be covered for plumb and the context provided below. Provide the subtopics in list format.\n\n"}],"model":"gpt-3.5-turbo","user":"U2815","stream":true}
Copy and pasted from my app - hence different text
Have a look at my message above to see how to get it as an array (using code similar to this)
List<OpenAIChat> _chat = new List<OpenAIChat>();
_chat.Add(new OpenAIChat() { role = "user", content = "Hello!" });
etc…
1 Like
Your observation makes a lot of sense. This is caused by:
string jsonContent = JsonConvert.SerializeObject(request, new JsonSerializerSettings() { NullValueHandling = NullValueHandling.Ignore });
but the real reason is probably that messages in my object is a String:
public String messages { get; set; }
So what should it be then, just Object?
It's actually an array or a List (you can use object if you cast the data properly).
But you can't use string.
I made a class called OpenAIChat.
It has two properties: role and content.
Then I made a List of the class with List<OpenAIChat>.
Finally, I added OpenAIChat objects to the list and passed this into messages.
So messages would be defined as:
public List<OpenAIChat> messages { get; set; }
Have a look at the code example a few messages back
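If you prefer a named request type instead of the anonymous object, a minimal sketch could look like this (ChatRequest is a made-up name; swap in whatever your request class is actually called):
public class ChatRequest
{
    public List<OpenAIChat> messages { get; set; }
    public string model { get; set; }
    public bool stream { get; set; }
}

// Usage: the key change is that messages is a List<OpenAIChat>, not a String
var chat = new List<OpenAIChat>();
chat.Add(new OpenAIChat() { role = "user", content = "Hello!" });
var request = new ChatRequest { messages = chat, model = "gpt-3.5-turbo", stream = true };
// JsonConvert.SerializeObject(request) now emits "messages" as a real JSON array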
3 Likes
Hey @Securigy
We understand your frustration, but the API works fine. I have been using it for the past 24 hours without a hitch; the difference is I'm coding in Ruby and you are using C#.
This means the issue, as @raymonddavey has been trying to help you with, is that you have some bug in your C# code.
However, since you will not take one minute to confirm there are no underlying issues using curl, as @curt.kennedy and I have suggested, then of course there might be another underlying issue.
Why not just take one minute of your time, try with curl, and post back the results to ensure there are zero underlying issues?
Thanks

Yep, you are absolutely right… I see I already have a similar one called Choices in the CompletionResult object.
Great catch!! My deep appreciation!
1 Like
Amending here: although I get Success, the first line that I am reading is this:
data: {"id":"chatcmpl-6psQJ649lGZpci17HgPLAR3tDkYxS","object":"chat.completion.chunk","created":1677821951,"model":"gpt-3.5-turbo-0301","choices":[{"delta":{"role":"assistant"},"index":0,"finish_reason":null}]}
no resemblance to the documentation:
{
"id": "chatcmpl-123",
"object": "chat.completion",
"created": 1677652288,
"choices": [{
"index": 0,
"message": {
"role": "assistant",
"content": "\n\nHello there, how may I assist you today?",
},
"finish_reason": "stop"
}],
"usage": {
"prompt_tokens": 9,
"completion_tokens": 12,
"total_tokens": 21
}
}
Because the OpenAI example response you posted is not streaming.
That response is based on stream: false.
Yes, the response is correct.
They have not documented the streaming response of the new chat API yet.
Here are some class definitions for C# that will help
public class CompletionResultChat
{
/// <summary>
/// The type of object returned by the API (for a streamed chunk this is "chat.completion.chunk")
/// </summary>
[JsonProperty("object")]
public string ObjectType { get; set; }
/// <summary>
/// Which model was used to generate this result. Be sure to check <see cref="Engine.ModelRevision"/> for the specific revision.
/// </summary>
[JsonProperty("model")]
public Engine Model { get; set; }
/// <summary>
/// The Completions returned by the API. Depending on your request, there may be 1 or many data.
/// </summary>
[JsonProperty("choices")]
public List<StreamChat> Choices { get; set; }
}
and this one for the Choices
public class StreamChat
{
/// <summary>
/// The main completion
/// </summary>
[JsonProperty("delta")]
public StreamDelta Delta { get; set; }
/// <summary>
/// Did it finish the completion
/// </summary>
[JsonProperty("finish_reason")]
public string Finish_Reason { get; set; }
/// <summary>
/// If multiple Completion data were returned, this is the index within the various data
/// </summary>
[JsonProperty("index")]
public int Index { get; set; }
}
and StreamDelta
public class StreamDelta
{
[JsonProperty("content")]
public string Content { get; set; }
}
If the response is “[DONE]”, then the output is complete. This hasn’t changed from the old completions endpoint
Otherwise, deserialize the response into a CompletionResultChat object:
CompletionResultChat deserial = JsonConvert.DeserializeObject<CompletionResultChat>(line);
The text will be in
deserial.Choices[0].Delta.Content
Edit: For clarity’s sake, “line” is part of the streaming response you are getting back in your ReadLineAsync method
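Put together, the read loop could look something like this (a sketch only, assuming the response object returned by your SendAsync call and the classes defined in this post):
using (var stream = await response.Content.ReadAsStreamAsync())
using (var reader = new StreamReader(stream))
{
    while (!reader.EndOfStream)
    {
        string line = await reader.ReadLineAsync();
        if (string.IsNullOrWhiteSpace(line)) continue;             // skip keep-alive blank lines
        if (line.StartsWith("data: ")) line = line.Substring(6);   // strip the SSE prefix
        if (line == "[DONE]") break;                                // stream is complete

        CompletionResultChat chunk = JsonConvert.DeserializeObject<CompletionResultChat>(line);
        string token = chunk.Choices[0].Delta.Content;              // null on the first role-only chunk
        if (token != null) Console.Write(token);
    }
}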
1 Like
Is that a complete response? Or simply what you implemented? I see that “created” is missing in CompletionResultChat…and there is “finish_reason”…and no “id”
It is the minimum to make it work. There are other properties you can include.
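For example, the sample chunk earlier in the thread also contains "id", "created" and "model", so you could add them as extra properties if you need them. A sketch only ("model" is shown as a plain string here for simplicity, rather than the Engine type above):
public class CompletionResultChat
{
    [JsonProperty("id")]
    public string Id { get; set; }

    [JsonProperty("object")]
    public string ObjectType { get; set; }

    [JsonProperty("created")]
    public long Created { get; set; }

    [JsonProperty("model")]
    public string Model { get; set; }

    [JsonProperty("choices")]
    public List<StreamChat> Choices { get; set; }
}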