Thank you
But when I invoke json_decode on it, it just freaks out.
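For context, this is roughly the debug path that produces the dumps below (a simplified sketch; $incoming and $raw are placeholder names, not my real variables):

<?php
// Simplified sketch of the debug path (placeholder names, not the real code):
// $incoming is the station message, $raw is the body string returned by the
// cURL call to the chat completions endpoint.
var_dump($incoming);              // e.g. "FROM Artic Fox: Hello"
var_dump($raw);                   // the raw JSON body from the API

$data = json_decode($raw, true);  // decode into an associative array
var_dump($data);                  // decoded structure, or NULL if decoding failed

$reply = $data['choices'][0]['message']['content'] ?? null;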
The raw responses from both servers are below.
First server:
string(22) "FROM Artic Fox: Hello"
string(429) "{
  "id": "chatcmpl-7mUNsUlRg0mDcEVM2T4zl7fAQLDLh",
  "object": "chat.completion",
  "created": 1691790896,
  "model": "gpt-3.5-turbo-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I assist you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 21,
    "completion_tokens": 9,
    "total_tokens": 30
  }
}
"
array(6) {
  ["id"]=>
  string(38) "chatcmpl-7mUNsUlRg0mDcEVM2T4zl7fAQLDLh"
  ["object"]=>
  string(15) "chat.completion"
  ["created"]=>
  int(1691790896)
  ["model"]=>
  string(18) "gpt-3.5-turbo-0613"
  ["choices"]=>
  array(1) {
    [0]=>
    array(3) {
      ["index"]=>
      int(0)
      ["message"]=>
      array(2) {
        ["role"]=>
        string(9) "assistant"
        ["content"]=>
        string(34) "Hello! How can I assist you today?"
      }
      ["finish_reason"]=>
      string(4) "stop"
    }
  }
  ["usage"]=>
  array(3) {
    ["prompt_tokens"]=>
    int(21)
    ["completion_tokens"]=>
    int(9)
    ["total_tokens"]=>
    int(30)
  }
}
NULL
Second server:
API Response:
string(1253) "From KL0XL TalkeetnaFrom KC4PTI Arctic Fox:string(22) "FROM Artic Fox: Hello"
string(429) "{ "id": "chatcmpl-7mUQNoVQBmztU1W2dUqxhADiQCdol", "object": "chat.completion", "created": 1691791051, "model": "gpt-3.5-turbo-0613", "choices": [ { "index": 0, "message": { "role": "assistant", "content": "Hello! How can I assist you today?" }, "finish_reason": "stop" } ], "usage": { "prompt_tokens": 21, "completion_tokens": 9, "total_tokens": 30 } } "
array(6) { ["id"]=> string(38) "chatcmpl-7mUQNoVQBmztU1W2dUqxhADiQCdol" ["object"]=> string(15) "chat.completion" ["created"]=> int(1691791051) ["model"]=> string(18) "gpt-3.5-turbo-0613" ["choices"]=> array(1) { [0]=> array(3) { ["index"]=> int(0) ["message"]=> array(2) { ["role"]=> string(9) "assistant" ["content"]=> string(34) "Hello! How can I assist you today?" } ["finish_reason"]=> string(4) "stop" } } ["usage"]=> array(3) { ["prompt_tokens"]=> int(21) ["completion_tokens"]=> int(9) ["total_tokens"]=> int(30) } } NULL "

string(1070) "{ "id": "chatcmpl-7mUQSGkF5S41TRyOUmRzGk2fDjoiz", "object": "chat.completion", "created": 1691791056, "model": "gpt-4-0613", "choices": [ { "index": 0, "message": { "role": "assistant", "content": "The output from the chat model outputs a greeting "Hello! How can I assist you today?". It is wrapped in a data structure, which contains not only the outputted message but also some metadata about the message. It includes the id of the chat completion, the object type, when it was created, the model that created it, why the response was ended (because it reached a stopping point), and also some details about the usage of tokens (which are elements of text that the model reads or generates). The token information is available as both individual counts for the prompt and the completion, and as a total. Lastly, it shows a null result because no further data was returned." }, "finish_reason": "stop" } ], "usage": { "prompt_tokens": 448, "completion_tokens": 138, "total_tokens": 586 } } "

array(6) { ["id"]=> string(38) "chatcmpl-7mUQSGkF5S41TRyOUmRzGk2fDjoiz" ["object"]=> string(15) "chat.completion" ["created"]=> int(1691791056) ["model"]=> string(10) "gpt-4-0613" ["choices"]=> array(1) { [0]=> array(3) { ["index"]=> int(0) ["message"]=> array(2) { ["role"]=> string(9) "assistant" ["content"]=> string(677) "The output from the chat model outputs a greeting "Hello! How can I assist you today?". It is wrapped in a data structure, which contains not only the outputted message but also some metadata about the message. It includes the id of the chat completion, the object type, when it was created, the model that created it, why the response was ended (because it reached a stopping point), and also some details about the usage of tokens (which are elements of text that the model reads or generates). The token information is available as both individual counts for the prompt and the completion, and as a total. Lastly, it shows a null result because no further data was returned." } ["finish_reason"]=> string(4) "stop" } } ["usage"]=> array(3) { ["prompt_tokens"]=> int(448) ["completion_tokens"]=> int(138) ["total_tokens"]=> int(586) } } NULL
json_decode just throws a fit and trashes the rest of the page output.
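Is something along these lines the sensible way to see what json_decode is actually choking on? (A rough sketch; $raw is a placeholder for the string being decoded.)

<?php
// Rough sketch of the error check I'm thinking of adding ($raw is a placeholder name).
$data = json_decode($raw, true);

if (json_last_error() !== JSON_ERROR_NONE) {
    // Surface the decoder's complaint instead of letting it trash the page output.
    error_log('json_decode failed: ' . json_last_error_msg());
} else {
    echo $data['choices'][0]['message']['content'];
}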