O1 model TypeError: network error

Hello,
Starting today, I’ve begun receiving “TypeError: network error” errors from the o1 model when connecting via the API. Besides that, it just keeps writing “Thinking…” plus status lines like “I’ll do it”, “I’m calculating”, “I’m planning…”, and nothing more. Is anyone else experiencing this problem?

Thank you,
Kind regards


Yep, the output does not contain a “message” type with content, only “reasoning”.

I’ll head off “support” wasting your time asking for request IDs when the whole model isn’t working right; here they are.

model="o1-2024-12-17"

'x-request-id': 'req_90f98568ccaa4bfc942a4f38d90f96f9'

  • easy question
{
  "id": "rs_0282f38d1e72994f01698e7d25ebc88197bfd7eee3f3ab1944",
  "summary": [],
  "type": "reasoning",
  "content": null,
  "encrypted_content": null,
  "status": null
}

req_d358c129588d42a69a4d49673c66d80c

  • thinking question
{
  "id": "rs_0678e6b3bcd92f1c01698e7d561cd88197a65bd4154dd68b71",
  "summary": [
    {
      "text": "**Exploring banana safety**\n\nThe user\u2019s question about whether a falling banana could kill someone is intriguing! The answer seems to be that while it's technically possible in extreme circumstances, it's very improbable. I'd like to clarify if they're asking literally or rhetorically. Factors like fall height, speed, and impact force matter. A typical banana isn't large or heavy enough to be lethal. Overall, it's generally unlikely for a banana to cause serious harm!",
      "type": "summary_text"
    }
  ],
  "type": "reasoning",
  "content": null,
  "encrypted_content": null,
  "status": null
}
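The failure mode above can be checked programmatically. This is a minimal sketch, not the official SDK API: the item dicts mirror the dumps in this thread, and the helper name is hypothetical. A healthy response contains a "message" item alongside the "reasoning" item; here there is none, so no answer text can be extracted.

```python
def extract_message_text(output_items):
    """Return the assistant's answer text, or None if no "message" item exists.

    output_items is the list of output item dicts, shaped like the JSON
    dumps above (each has a "type" and, for messages, a "content" list).
    """
    texts = []
    for item in output_items:
        if item.get("type") == "message":
            # Message content parts carry the visible answer as "output_text".
            for part in item.get("content") or []:
                if part.get("type") == "output_text":
                    texts.append(part.get("text", ""))
    return "".join(texts) or None

# The broken response from this thread: a reasoning item and nothing else.
broken_output = [
    {"type": "reasoning", "summary": [], "content": None, "status": None},
]
print(extract_message_text(broken_output))  # → None (no answer was returned)
```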

Same issue with o3-mini.

Also in the response: "error": null, "incomplete_details": null.
The API echoes back "parallel_tool_calls": true, which shouldn’t even apply here, but setting it to false didn’t change anything.

Another sign it’s messed up: usage reports zero input tokens, even though the reasoning is clearly about the input:

"usage": {"input_tokens": 0, "input_tokens_details": {"cached_tokens": 0}, "output_tokens": 256, "output_tokens_details": {"reasoning_tokens": 256}, "total_tokens": 256}

Chat Completions works.
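Since Chat Completions still works, a stopgap is to route the same prompt there. A minimal sketch, assuming the official openai Python SDK: the payload builder below is hypothetical and just assembles the arguments you would pass to client.chat.completions.create(**payload).

```python
def build_chat_fallback(prompt: str, model: str = "o1-2024-12-17") -> dict:
    """Assemble Chat Completions arguments equivalent to a simple Responses call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Usage (requires an API key; not run here):
# from openai import OpenAI
# client = OpenAI()
# completion = client.chat.completions.create(
#     **build_chat_fallback("Can a falling banana kill you?")
# )
# print(completion.choices[0].message.content)
```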

Hi! Is this still an issue?


Hello,
The problem is fixed now.
Thank you for taking care of it.
