Error in input stream... all day long

Hi,
Is anyone else having problems with ChatGPT today? For me, it’s been super annoying. In ~9 out of 10 requests, it gets to the end of a long output, such as a web search or code interpreter session, and then throws “Error in input stream”, forcing a complete regeneration.

I was planning to use it to help me build on the new code interpreter API, but if the API is as flaky as this, maybe I’ll wait. I also wonder how tokens would be charged in the API for these failed requests. Would the API stream the tokens and then error out near the end, or would the error happen server-side and still bill me for that request?
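If/when I do wire it up, I’m thinking of wrapping the stream in a simple retry loop so a mid-stream failure doesn’t force me to babysit regenerations. This is only a rough, untested sketch using the openai Node SDK; the model name and retry count are placeholders, and I genuinely don’t know yet whether tokens streamed before a failure get billed:

    import OpenAI from "openai";

    const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

    // Retry a streamed chat completion a few times if the stream dies mid-response.
    async function streamWithRetry(prompt: string, maxAttempts = 3): Promise<string> {
      for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
          const stream = await client.chat.completions.create({
            model: "gpt-4-1106-preview",                     // placeholder model name
            messages: [{ role: "user", content: prompt }],
            stream: true,
          });
          let text = "";
          for await (const chunk of stream) {
            text += chunk.choices[0]?.delta?.content ?? "";  // accumulate streamed tokens
          }
          return text;                                       // stream finished cleanly
        } catch (err) {
          // Tokens already streamed before the failure may or may not be billed.
          console.warn(`Attempt ${attempt} failed mid-stream:`, err);
        }
      }
      throw new Error("All attempts failed");
    }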

Has anyone played much with the new API?

Thanks!

BTW, I’m not complaining, I know it’s all very beta, I’m just chomping at the bit to get my workflow plugged in with the new tools. And code-interpreter, when it works, it works really well!

6 Likes

I’m running into the same issues, which makes me wonder whether there’s been another reduction or “optimization” of the context window that hurts longer chats. If so, that could open up a significant business opportunity for anyone selling extended-context solutions. As a devoted fan, I’m hesitant to look elsewhere for coding help. The extreme slowness is also a concern; I suspect scaling up will be a formidable challenge, especially with so many competing priorities in play.

1 Like

I’m having the same issue when working with GPTs. Happens almost every answer.

4 Likes

I know, I really wasn’t exaggerating when I said it’s failing 9/10 times. Even without web search or code interpreter, I am getting this error.

3 Likes

Same issue here. Haven’t been able to get a single, complete answer over the last hour or so.

2 Likes

Also having this issue. Wish they would provide some update.

1 Like

I have had the same issue since the last big update to the GPT-4 model. Almost every task that uses Bing ends with this error, “Error in input stream”. I also got an error in a new conversation saying there was too much text and asking me to start a new conversation.

1 Like

It’s been pretty much the same thing all week long. Not a single message has completed. It’s gotten to the point where I’m using Bard and Claude as alternatives. I’m considering dropping my API and ChatGPT Plus plans at the end of this month.

Highly frustrating!!
(Screenshot attached, 2023-11-28, 7:46 PM)

3 Likes

Not “Error in input stream,” but I’ve been getting a “Network Error” 8 out of 10 times in the past couple of hours. Response generation has also been especially slow today.

Especially frustrating because you can only generate 24 Custom GPT outputs every 3 hours as it is. Hope it’s fixed soon.

I’ve been a subscriber since April and this has been the worst month in terms of incidents so far.

2 Likes

The same has been happening to me for the past 3–4 hours. I’m trying to work on something with a lot of tasks pending. It started throwing errors, first “Error in input stream” and then later “Message in conversation not found”.

2 Likes

Where did you find that information about the limits on GPT usage?

Trial and error, plus confirmation from other users. I’ve hit the limit around 20 times so far. It’s usually 24, no matter how short or long the output is (though once I could only get 17).

Got the same message today in my VERY FIRST QUERY after purchasing access to GPT-4. A bit discouraging…

1 Like

I’m experiencing the same error, and it costs a lot of time and headaches… If it can’t be fixed, I’m thinking of switching to other services. Some texts take 10–15 attempts before they get written once.

2 Likes

Yep, getting this error daily, unfortunately; it’s making me look for alternatives.

1 Like

I have been having this problem for the last few days now. I ask for a really simple task: creating multiple-choice questions on certain topics for my students. If I do that in a batch, let’s say 10 questions, the quality is as if a monkey had created them: really, really dumb answers that I cannot present to my students. For the last few days, this has forced me to redo the work over and over in smaller batches.

Today, even if I ask for a smaller quantity, let’s say 2 or 3 at once per topic, I get “Error in input stream”. I first asked it to provide the multiple-choice questions and answers in a table, but now it isn’t working in any format at all. Even when I only want one question, I get “Error in input stream” after a long wait. What’s going on? I checked the server status and it seems fine. Can that be right?

The bad thing is that any wrong or missing answer, or any uncompleted task, still counts toward the cap.
So if I ask it 5 times to complete the task, 5 attempts are deducted from the cap even though the result is incomplete and trash. I spend an hour trying to get a useful result, and then I have to wait three hours afterwards. Very disappointing! I find that really unfair. I really don’t know why I am paying for this sh… and will cancel my subscription.

P.S. Sorry for editing it all the time, this is my first post.

Edit 18.01.: Again, the same problem. I just started to work; it’s 19:29 here in Germany. This thing is not able to create three answers: “Error in input stream”. Now, at 19:48, the page is not available at all anymore. Thanks for nothing.
I get the feeling that somebody is using all the capacity of this program for a few hours every day, perhaps because the service is primarily tuned for performance in the U.S. market. Two days ago, when I went to bed and did my Korean conversation training after midnight, it worked fine, and during the day it also seems to work fine. Please be honest, OpenAI: do you focus on U.S. customers while everybody else gets degraded service?!

Edit 20.01.: Over the day the performance was somewhat OK; ChatGPT answered my questions in a helpful manner. Now it’s 5:06 pm and I’m getting crap again. I want 6 multiple-choice questions, each with a practical example (explanation, answers, and a short description of the background of the question), and I only get a question, no example, although my prompt told it to write at least 2 sentences. The possible answers consist of one word and the explanation is one sentence, although I asked for more. It did not even present it in a table, although I gave at least 2 prompts saying to do so. Again, how can I work with this shit?

Edit 21.01.: Today I cannot even ask it to produce a simple text. The system stopped at one of the paragraphs and froze. It’s simply annoying. Yesterday I stopped working with GPT-4 and switched to GPT-3.5 instead. It was much more fluent in its work, with the major pitfall that its text-scanning ability was not as good as GPT-4’s. But really, you would have to be foolish to pay for this right now. Now, at 17:17, after some hours of trying to get something done, GPT-3.5 is on the same shitty level as GPT-4. I don’t know what this is about; I don’t change my prompts. Sometimes it answered instantaneously, sometimes it just did sh…

2 Likes

Same here for 3–4 days now. I can try regenerating an output 10 times in a row and get an “Error in input stream” each time until I finally stop trying. That, or a network error. ChatGPT is almost unusable. I can’t say they’re resolving the issue quickly.

It’s really annoying not to be able to work because of the unstable quality of a service you pay for.

2 Likes

The quality of reasoning in GPT-4’s responses has degraded significantly. It went from abstracting my text and putting forth analysis to presenting repetitive word salads of my input, going in circles. Within a single chat of mine, there’s a clear drop in quality.
It has started to reply in bullet points, very GPT-3.5-like, I must say.

Not happy with this inconsistency in their service.

3 Likes

OpenAI GPT-4 deployed to Azure worked fine for weeks, then suddenly stopped working in January; it answered as usual only about 1 time in 10. The other 9 out of 10 times, I got the same error:

TypeError {}
 
    columnNumber: 1
    fileName: ""
    lineNumber: 0
    message: "Error in input stream"
    [[Prototype]]: Object

Solved by setting maxTokens to 4096
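
In case it helps anyone else hitting the same TypeError, here is roughly where that cap goes. This is only a sketch assuming the @azure/openai JS SDK; the endpoint, key variable, and deployment name below are placeholders, so adapt it to whichever client setup you actually use:

    import { OpenAIClient, AzureKeyCredential } from "@azure/openai";

    async function main() {
      const client = new OpenAIClient(
        "https://<your-resource>.openai.azure.com/",            // placeholder endpoint
        new AzureKeyCredential(process.env.AZURE_OPENAI_KEY ?? "") // placeholder env var name
      );

      const result = await client.getChatCompletions(
        "gpt-4",                                                 // your deployment name
        [{ role: "user", content: "Hello" }],
        { maxTokens: 4096 }                                      // the explicit cap from the workaround above
      );

      console.log(result.choices[0]?.message?.content);
    }

    main().catch(console.error);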

1 Like

Bumpity bump! I thought the Team membership was supposed to be better, lol. Unstable asf.