Workarounds for POST Request Size Limitations in Custom GPT Actions

Howdy folks,

I’ve been running into an issue where my POST actions fail due to corrupted JSON. After looking at the debug error the GPT reports, and considering the scenarios in which it happens, I’ve concluded that a POST request has a size limit, beyond which the body gets truncated and causes the issue.

I thought of creating batched requests and rebuilding the payload on the server side, but that means introducing session state into my API, which would be a bit naff.
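For reference, this is roughly the stateful reassembly I'm hoping to avoid. A minimal sketch, assuming a Flask API and hypothetical upload_id / chunk_index / total_chunks fields that the GPT would have to supply:

from flask import Flask, request, jsonify

app = Flask(__name__)
pending = {}  # upload_id -> {chunk_index: content}; the session state I'd rather not have

@app.route("/upload-chunk", methods=["POST"])
def upload_chunk():
    body = request.get_json()
    chunks = pending.setdefault(body["upload_id"], {})
    chunks[body["chunk_index"]] = body["content"]

    if len(chunks) == body["total_chunks"]:
        # All chunks received: rebuild the payload in order and clear the buffer.
        full_content = "".join(chunks[i] for i in sorted(chunks))
        pending.pop(body["upload_id"])
        # ...write full_content to disk or hand it to the real handler here...
        return jsonify({"status": "complete"})
    return jsonify({"status": "partial", "received": len(chunks)})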

Anyone else faced this, and come up with sweet solutions?


Are you saying you’ve been able to get it to work with small files? Because no one else in the community has been able to do that. Upload your schema and I’ll see if I can find a file-size solution, but first we need to see what actually works even with small files.


Yeah, I can attach a file to the message to my GPT; it sends it across as base64, and then I save it locally. Just tested it with a 5x5 image and it worked fine, but file size is the recurring issue with anything much larger.
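The receiving end is nothing fancy, roughly this (a sketch; the path / content / isBase64 fields are from my own schema, not anything standard):

import base64
from pathlib import Path

def save_file(path: str, content: str, is_base64: bool) -> None:
    # Decode base64 payloads back to bytes; treat everything else as plain text.
    target = Path(path)
    target.parent.mkdir(parents=True, exist_ok=True)
    if is_base64:
        target.write_bytes(base64.b64decode(content))
    else:
        target.write_text(content)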

I turned on Code Interpreter and instructed it to minify the JSON body prior to setting the action body parameter.
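The minify step itself is nothing exotic; it boils down to a json.loads()/json.dumps() round trip with compact separators. A sketch of the idea (the file names here are made up):

import json

# Re-serialize the drafted body with no whitespace between tokens.
drafted = '{\n  "path": "notes/todo.txt",\n  "content": "hello world"\n}'
minified = json.dumps(json.loads(drafted), separators=(",", ":"))
print(minified)  # {"path":"notes/todo.txt","content":"hello world"}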

You could also try adding minify steps to the instructions and skipping the Code Interpreter step, which slows things down.

Of course, this may only help with relatively small messages, as you may still hit a limit.

I get the impression the limit error is on the Python client side.

The limit is due to this error:

{
  "response_data": "ApiSyntaxError: Could not parse API call kwargs as JSON: exception=Unterminated string starting at: line 1 column 3541 (char 3540)"
}

I just did some tests, and the most I managed to get it to send successfully was 3674 characters of body, including whitespace. I tried getting it to remove the whitespace, but haven’t had much success with that yet. I’ll keep experimenting, but I’m also implementing file chunking with a limit of 3000 characters per chunk as a temporary workaround.
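The chunking itself is just a straight split. A sketch, using the same hypothetical upload_id / chunk_index / total_chunks fields as the reassembly idea in my first post:

CHUNK_SIZE = 3000  # characters per chunk, comfortably under the ~3.6k ceiling I'm seeing

def split_into_chunks(content: str, upload_id: str):
    # Yield one request payload per slice of the file content.
    chunks = [content[i:i + CHUNK_SIZE] for i in range(0, len(content), CHUNK_SIZE)]
    for index, chunk in enumerate(chunks):
        yield {
            "upload_id": upload_id,
            "chunk_index": index,
            "total_chunks": len(chunks),
            "content": chunk,
        }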

This may not just be a size issue; there may be invalid JSON causing the problem. Running the request body through the Python json.dumps() function will clean up some basic JSON structural issues.

Having said that, I’m still grappling with the error even on messages smaller than yours.

I used a GPT instruction step—

Construct Minified JSON Request: Use the Code Interpreter to run the Python json.dumps() function on the JSON requestBody parameter value following the provided simplified schema. This step is crucial to prevent ‘ApiSyntaxError’.
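Spelling that step out: the value of the round trip isn’t just size, it also surfaces malformed JSON inside the Code Interpreter instead of as an ApiSyntaxError when the action fires. A sketch of the intent (the Code Interpreter writes its own version each run):

import json

def build_request_body(drafted: str) -> str:
    # Parsing first catches bad JSON early; re-dumping guarantees proper
    # escaping and terminated strings in what gets sent to the action.
    try:
        parsed = json.loads(drafted)
    except json.JSONDecodeError as err:
        raise ValueError(f"Drafted body is not valid JSON: {err}") from err
    return json.dumps(parsed, separators=(",", ":"))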

Oh, interesting. I’ll try adding some steps directing it on how to use the Code Interpreter and see if I have any success.

Interestingly, my successful longer requests always consisted of multiple JSON elements within the body: if I have one parameter with a long string as the value, it falls over more often than if I have five shorter parameters, even when the total length is the same.

It’s really tricky trying to figure out what works and what doesn’t. When I implemented a chunking mechanism it worked sometimes, but the GPT kept losing track of which chunks it had sent, sending an empty body in the request, or just erroring out in the interpreter again.

I have done some more testing and found that I can send 11 files whose combined content totals 3777 characters, but if I go above that I get the dreaded ApiSyntaxError.

However, if I try to send a single file with 3777 characters I also get the error.
I would have expected a single file to allow for greater length, since there is less metadata (file paths etc.) to send across, but the opposite seems to be true.

I guess you’re right that it’s not just a size issue… I couldn’t get the minified JSON request approach to help, though. If I can somehow break through the twelve-file barrier I’ll be stoked, although I’d obviously prefer to be able to send single large files.

Largest Successful Test (11 small files)
The content of each of those files isn’t actually 500 characters (the GPT got confused when generating the text); they’re ~340 characters each and total 3777. If I add a 12th file of similar length, it goes back to the ApiSyntaxError.

Request Body
  "params": {
    "files": [
      {
        "path": "TestElevenBatchDifferent/testFile1.txt",
        "content": "This test file is for API testing. Its content is structured to test the API's multiple file upload feature. With exactly 500 characters of text, including all spaces and punctuation marks, this file helps in verifying the API's file handling and storage capabilities. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile2.txt",
        "content": "For testing the API, this file is a crucial test. The setup of this file's content is to evaluate the multiple file upload functionality of the API. This file contains exactly 500 characters of text, including all spaces and punctuation marks, aiding in the verification of the API's file handling and storage capabilities. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile3.txt",
        "content": "API's testing is facilitated by this test file. The formulation of this file's content is for assessing the API's ability to upload multiple files. Comprising exactly 500 characters of text, including all spaces and punctuation marks, this file is pivotal in confirming the API's file management and storage efficiency. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile4.txt",
        "content": "This file, designed for API testing, serves as a critical test. The content of this file is crafted to examine the API's multi-file upload process. Containing exactly 500 characters of text, along with all spaces and punctuation marks, this file is instrumental in assessing the API's capability in file handling and storage. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile5.txt",
        "content": "Testing the functionality of the API, this file is a significant test. The organization of this file's content is aimed at testing the API's facility for uploading multiple files. This file, with exactly 500 characters of text including all spaces and punctuation marks, plays a key role in evaluating the API's file handling and storage functions. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile6.txt",
        "content": "A test file for API assessment, this file is crucial. Its content is organized to test the API's capacity for multiple file uploads. With a total of exactly 500 characters of text, including all spaces and punctuation marks, this file contributes to verifying the API's file handling and storage proficiency. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile7.txt",
        "content": "For API evaluation, this test file is essential. The content here is laid out to test the API's multi-file upload capability. This file, containing exactly 500 characters of text, including all spaces and punctuation marks, is vital in verifying the API's efficiency in file handling and storage. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile8.txt",
        "content": "This file is a test for the API. It is designed to test the multiple file upload feature of the API. This file contains exactly 500 characters of text, including all spaces and punctuation marks. This will help in verifying the file handling and storage capabilities of the API. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile9.txt",
        "content": "API testing is facilitated by this test file. The arrangement of this file's content aims to assess the multiple file upload functionality of the API. This file, including all spaces and punctuation marks, totals exactly 500 characters of text, which is instrumental in verifying the API's file management and storage capabilities. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile10.txt",
        "content": "Testing the API, this file is an essential test. The organization of the content in this file aims to evaluate the API's capability of uploading multiple files. Comprising exactly 500 characters of text, along with all spaces and punctuation marks, this file plays a key role in confirming the API's ability to handle and store files. End of the test file content.",
        "isBase64": false
      },
      {
        "path": "TestElevenBatchDifferent/testFile11.txt",
        "content": "To test the API, this file is a crucial test. The arrangement of this file's content aims to assess the API's multi-file upload feature. This file, including all spaces and punctuation marks, totals exactly 500 characters of text, which is instrumental in verifying the API's file management and storage capabilities. End of the test file content.",
        "isBase64": false
      }
    ]
  }
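For anyone comparing numbers: the 3777 is the combined length of the content fields. The full serialized body, which is presumably what the action layer actually cares about, is larger once you count the paths and keys. I tally both roughly like this; a sketch, with the body truncated to one file for brevity:

import json

# In practice `body` is the full request body shown above.
body = {
    "params": {
        "files": [
            {
                "path": "TestElevenBatchDifferent/testFile1.txt",
                "content": "This test file is for API testing. ...",
                "isBase64": False,
            },
        ]
    }
}

combined_content = sum(len(f["content"]) for f in body["params"]["files"])
serialized_length = len(json.dumps(body, separators=(",", ":")))
print(combined_content, serialized_length)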

Do you explicitly ask it to convert the image to base64?

No, it seems to automatically use base64 for any file that doesn’t parse as raw text. I guess it looks for a file-type marker at the start of the file content: if it doesn’t see one it assumes raw text, otherwise it base64-encodes it, or something along those lines.
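Purely a guess at what the detection might look like, but something along these lines would explain the behaviour, whether it keys off magic bytes or simply off whether the bytes decode as text (illustrative only, not confirmed):

import base64

def encode_for_action(data: bytes) -> tuple[str, bool]:
    # Illustrative stand-in for whatever ChatGPT actually does: if the bytes
    # decode cleanly as UTF-8, send them as-is; otherwise fall back to base64.
    try:
        return data.decode("utf-8"), False
    except UnicodeDecodeError:
        return base64.b64encode(data).decode("ascii"), True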


And I suppose asking it to send in binary format won’t be allowed? (i.e. reducing the size by 30% or so, since base64 inflates the payload by roughly a third)

My use case was plain JSON; I haven’t tried working with images.

Are you sending the image in params or request body?