Custom GPT Action
My custom GPT uses Code Interpreter to correctly construct and validate the JSON, and it has strict instructions to send that validated JSON to the API.
When it sends the request to the custom Action's API, the debug log shows that a different, corrupted JSON body is being sent, which causes the ApiSyntaxError.
E.g. Code Interpreter's correct JSON snippet:
Debug log's JSON snippet, where it fails to close the dataStored object with a closing }:
Why is it not sending the JSON that is correctly constructed and validated by the Code Interpreter?
Is it reprocessing the message internally before sending and corrupting it in the process?
The corruption seems to happen with nested objects.
How exactly does an Action internally generate the params {}?
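For reference, the failure mode described above can be reproduced locally with Python's json module. This is only an illustrative sketch: the payload and field names (including dataStored) stand in for the real schema, which isn't shown in the thread.

```python
import json

# Hypothetical payload with a nested object, standing in for the
# failing dataStored field (names are illustrative, not the real schema).
payload = {
    "user": "abc",
    "dataStored": {"key": "value", "count": 3},
}

body = json.dumps(payload)
assert json.loads(body) == payload  # round-trip: the JSON itself is valid

# Dropping the outer closing brace -- the kind of corruption the debug
# log shows -- makes the body unparseable:
try:
    json.loads(body[:-2] + body[-1])  # keep only one of the two trailing '}'
except json.JSONDecodeError as e:
    print("parse fails:", e)
```

So the JSON Code Interpreter produces is fine; whatever re-serializes the body between Code Interpreter and the Action call appears to lose a brace.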
Similar issue here. My API works when the amount of POST data is small, but when I send 100+ lines of data, I get:
{
"response_data": "ApiSyntaxError: Could not parse API call kwargs as JSON: exception=Unterminated string starting at: line 1 column 1717 (char 1716) url=my/api"
}
My logs don’t even show GPT hitting the API when the POST body is too long; the API works fine with Postman.
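Notably, that error text is exactly what Python's json module raises when a string literal is cut off mid-way, which suggests the body is being truncated before it is ever parsed (consistent with the request never reaching the API). A minimal reproduction, with a made-up payload and cutoff point:

```python
import json

# Simulate a long request body cut off mid-string. This reproduces the
# same "Unterminated string starting at ..." exception the Action reports.
body = json.dumps({"data": "x" * 2000})
truncated = body[:1717]  # arbitrary cutoff inside the long string value

try:
    json.loads(truncated)
except json.JSONDecodeError as e:
    print(e)
```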
Same here. It happens when the GPT tries to perform an action with longer text in the payload.
Error message:
“ApiSyntaxError: Could not parse API call kwargs as JSON: exception=Extra data: line 1 column 1691 (char 1690) url=https://…/api/notes/{noteId}”
Request body:
{
"noteId": "100a3791-562b-48c6-8618-012e638934b5",
"note": " ... "
}
I redacted the note content, but it is 1631 characters long. This isn’t the first time I’ve encountered this; it happens rather often when performing actions with “lengthy” payloads.
This looks similar to my error in another thread, which also only happens with API calls that include long JSON (can’t link).
I’m still running into this issue. With longer payloads, the params object arrives empty ({}), and a JSON error occurs in more than 50% of attempts. Was anyone able to solve this? I tried re-adding the action.
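One workaround worth trying while this is broken is to split long text across several small Action calls, since small payloads reportedly go through reliably. This is only a sketch under assumptions: the chunk size is a guess (the real failure threshold isn't documented), and the field names are hypothetical, not a real API's schema.

```python
import json

def chunk_note(note: str, max_len: int = 800) -> list[str]:
    """Split long note text into pieces small enough that each Action
    call stays below the size where parsing seems to start failing.
    max_len is a guess; the actual threshold isn't documented."""
    return [note[i:i + max_len] for i in range(0, len(note), max_len)]

# Each chunk becomes its own small request body (field names hypothetical):
chunks = chunk_note("x" * 2000)
bodies = [json.dumps({"part": i, "total": len(chunks), "note": c})
          for i, c in enumerate(chunks)]
```

The server side would then need to reassemble the parts, which is extra work, but it avoids ever sending a body long enough to trigger the error.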
I’ve been struggling for days… even with only 10 characters in the JSON.
Has anybody found a solution?
Unfortunately not; it seems the feature is broken and there’s no interest in fixing it.