Recently, I’ve been experimenting with a super cool idea: integrating Gemini Pro Vision into GPTs via Actions.
While testing the schema, I ran into a problem. The API requires the image data to be provided Base64-encoded in the ‘data’ parameter, which leads to very large POST requests and in turn triggers the ‘Stopped talking to’ issue.
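For a sense of scale: Base64 inflates binary data by about a third, so even a modest image becomes a very large request body before any JSON escaping overhead is counted. A quick sketch in Node (the buffer is a stand-in for real image bytes):

```javascript
// Sketch: how much Base64 inflates a request payload.
// The buffer below is a stand-in for real image data.
const imageBytes = Buffer.alloc(1024 * 1024, 0xab); // 1 MiB of dummy bytes

const base64Data = imageBytes.toString('base64');

// Base64 emits 4 output characters for every 3 input bytes (plus padding),
// so the encoded payload is roughly 33% larger than the original.
console.log(imageBytes.length); // 1048576
console.log(base64Data.length); // 1398104
```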
Getting ChatGPT to output long string dumps (structured much like Base64 image data) is one of the methods folks have been using to get OpenAI to reveal some of its training data. OpenAI has been locking things down behind the scenes, but has been a little vague about limits while they tweak things. I have not been able to find any officially published maximum limits, only people’s speculation.
Well, consider yourself lucky. You got an error message.
For me, the message is “Talked to…”, then it hangs there forever. That is definitely a showstopper for the upcoming GPT Store launch.
I do hope they thoroughly review that area and offer some guidelines.
I might as well point out that this problem has nothing to do with AI expertise. It falls in the realm of traditional web service/JavaScript programming.
I suggest OpenAI hire more traditional DevOps people to strengthen the service delivery side of their platform.
I have the same issue. I can’t POST long lists of data - maybe at about 100 lines I run into issues. My API works fine from Postman.
Originally it was the “Stopped talking to” error.
However, more recently, it’s giving me:
{
  "response_data": "ApiSyntaxError: Could not parse API call kwargs as JSON: exception=Unterminated string starting at: line 1 column 1717 (char 1716) url=my/api/location"
}
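For what it’s worth, that failure is reproducible outside of ChatGPT: if an unescaped quote (or newline) from the payload lands inside the JSON request body, the string literal terminates early and the parser chokes the same way. A minimal sketch (the payload value is made up):

```javascript
// An unescaped quote from the payload terminates the JSON string literal
// early, leaving the rest of the request body unparseable.
const script = 'echo "hello"'; // hypothetical payload containing quotes
const badPayload = '{"params": {"data": "' + script + '"}}';

let parseFailed = false;
try {
  JSON.parse(badPayload); // SyntaxError: the inner quotes break the literal
} catch (err) {
  parseFailed = true;
}

// Escaping the value first keeps the request body valid JSON.
const goodPayload = '{"params": {"data": ' + JSON.stringify(script) + '}}';
const parsed = JSON.parse(goodPayload);
console.log(parseFailed);        // true
console.log(parsed.params.data); // echo "hello"
```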
The thing is, it’s never actually hitting my API when the POST value is too long. My logs confirm this.
The entire process, even when it works, is also unbearably slow.
Your problem is of a different nature. There is an easy fix for that.
ChatGPT is clearly telling you the problem is the “string starting at: line 1 column 1717…”
ChatGPT packs your payload together with the system info into a big JSON object. Some characters in your payload can break the JSON syntax of the final package.
Many solutions are available. For example, you can JSON.stringify your payload; in the worst case, you can Base64-encode it.
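To illustrate both options (the payload string here is made up): JSON.stringify escapes every character that could break the surrounding JSON, while Base64 sidesteps the problem entirely because its output alphabet contains nothing JSON cares about.

```javascript
// Two ways to make an arbitrary payload safe inside a JSON request body.
const payload = 'line one\nsay "hi"\tdone'; // made-up payload with newline, quotes, tab

// Option 1: JSON.stringify escapes every problematic character,
// producing a valid JSON string literal you can embed directly.
const escaped = JSON.stringify(payload);

// Option 2 (worst case): Base64. The output alphabet is [A-Za-z0-9+/=],
// which can never break JSON syntax; the server decodes it back.
const b64 = Buffer.from(payload, 'utf8').toString('base64');
const roundTripped = Buffer.from(b64, 'base64').toString('utf8');

console.log(JSON.parse(escaped) === payload); // true
console.log(roundTripped === payload);        // true
```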
The data is still inside ChatGPT and has not been sent out yet. ChatGPT fails to populate the payload into “params”: {}, which is completely empty, as you can see in the endpoint request. Nobody but ChatGPT can write data into “params”.
This helped me. I had to ask GPT to escape the characters that would interfere with passing the data (a sh script) in a JSON request. Then it worked like a charm!