GPT actions - new "ResponseTooLargeError" failure when handling API response

I’m creating a GPT calling a custom API. All requests to the API fail with a “ResponseTooLargeError” status.

These same calls were working yesterday. The API response doesn’t seem huge to me (< 1000 tokens).

Is this a bug, or is there a new limit on the size of the response GPTs can handle?


Seeing that as a common issue right now, thanks for reporting it.

It would be awesome if actions could take parameters to omit fields from the response, "sanitizing" it before it is returned to the GPT. In many cases a lot of the data is redundant.
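Until something like that exists, the same idea can be applied in whatever wrapper sits between the upstream API and the action. A minimal sketch, where the field names and the `sanitize` helper are purely illustrative:

```python
import json

# Hypothetical allow-list: keep only the fields the GPT actually needs.
ALLOWED_FIELDS = {"id", "title", "status", "url"}

def sanitize(record: dict) -> dict:
    """Drop everything except the allow-listed keys from one record."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def sanitize_response(records: list[dict]) -> str:
    """Filter each record and serialize compactly before returning it to the GPT."""
    return json.dumps([sanitize(r) for r in records], separators=(",", ":"))
```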


I had a thought that it would be cool to use hardcoded Code Interpreter scripts for action post-processing, here: Response Filtering for Actions.
Though the "response too large" error I get there is probably more legitimate, because the GitHub API I was using returns a lot of extra data.
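For the GitHub case, a post-processing step that prunes the response down to a few fields goes a long way. A rough sketch (the repo is a placeholder, and the choice of fields is just an example of what a GPT typically needs):

```python
import json
import requests  # assumes the requests library is available

# Fetch issues, then keep only a handful of fields from the verbose issue objects.
resp = requests.get("https://api.github.com/repos/octocat/Hello-World/issues")
resp.raise_for_status()

slim = [
    {
        "number": issue["number"],
        "title": issue["title"],
        "state": issue["state"],
        "url": issue["html_url"],
    }
    for issue in resp.json()
]

print(json.dumps(slim, separators=(",", ":")))
```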

Got the same error. Looks like the GitHub action API returns a lot of additional info. Another option might be to analyse the response with gpt-3.5-turbo-16k or another cheap model with a large context window, or to use a simple XPath-style parser to pull out just the fields you need.
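If you go the cheap-model route, a sketch of what that offloading step could look like (the model name, prompt, and `condense` helper are illustrative, not something the actions framework provides):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def condense(raw_json: str) -> str:
    """Ask a large-context model to extract only the fields the GPT needs."""
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo-16k",
        messages=[
            {
                "role": "system",
                "content": "Extract only the issue numbers, titles and URLs "
                           "from the JSON below and return them as compact JSON.",
            },
            {"role": "user", "content": raw_json},
        ],
    )
    return completion.choices[0].message.content
```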

Try using json.dumps(portrait, separators=(",", ":")) for serialization. It removes the spaces from the resulting JSON string. Worked for me.
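For anyone curious how much that saves, a quick comparison with a stand-in payload (`portrait` here is just a sample dict, not the original poster's data):

```python
import json

portrait = {"name": "example", "tags": ["a", "b", "c"], "nested": {"k": 1}}

default = json.dumps(portrait)                          # uses ", " and ": " separators
compact = json.dumps(portrait, separators=(",", ":"))   # no spaces at all

print(len(default), len(compact))  # the compact form is smaller on any non-trivial payload
```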