ChatGPT always tries to parse text/plain body as JSON

I have an endpoint that essentially just evaluates JavaScript. The body of the request is just the code itself. ChatGPT appears to respect that, since it generates the code alone, but there is a Python error on the OpenAI side because it tries to parse the JavaScript body as JSON.

This is the error that is output:

ApiSyntaxError: Could not parse API call kwargs as JSON: exception=Expecting value: line 1 column 1 (char 0) url=http://localhost:3000/javascript/run json_text=// Calculate the number of days between July 4, 1776 and the current date (April 7, 2023).

const startDate = new Date(1776, 6, 4); // July 4, 1776
const currentDate = new Date(2023, 3, 7); // April 7, 2023

const millisecondsPerDay = 24 * 60 * 60 * 1000;
const daysSinceJuly4th1776 = Math.floor((currentDate - startDate) / millisecondsPerDay);

daysSinceJuly4th1776;

This is the spec for the api:

openapi: 3.0.3
info:
  title: JavaScript Code Execution API
  version: 1.0.0
  description: An API to execute JavaScript code snippets.

servers:
  - url: http://localhost:3000
    description: Local development server

paths:
  /javascript/run:
    post:
      summary: Execute a JavaScript code snippet.
      description: Given a snippet of JavaScript code, evaluate it and return the result.
      operationId: runJavascript
      requestBody:
        required: true
        content:
          text/javascript:
            schema:
              type: string
              description: The JavaScript code snippet to execute.
      responses:
        200:
          description: Code execution was successful.
          content:
            text/plain:
              schema:
                oneOf:
                  - type: string
                  - type: number
                  - type: integer
                  - type: boolean
                  - type: array
                  - type: object
                description: The result of the executed JavaScript code.
        500:
          description: An error occurred during code execution.
          content:
            text/plain:
              schema:
                type: string
                description: The error message received during code execution.

Note that it will work if I make the API accept JSON, but ChatGPT has to make a few requests to figure out that it has to escape all of the quotes inside of a string.
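One server-side hedge (a sketch, not something from the original post) is to accept either shape: try to decode the body as a JSON wrapper and fall back to treating it as raw code. The `extract_code` helper and the `{"code": ...}` wrapper key are assumptions for illustration:

```python
import json

def extract_code(body: str) -> str:
    """Return the code from either a JSON wrapper or a raw text body.

    Accepts '{"code": "..."}' (what ChatGPT tends to send when the spec
    asks for JSON) as well as a bare code string sent as text/plain.
    """
    try:
        payload = json.loads(body)
        if isinstance(payload, dict) and "code" in payload:
            return payload["code"]
    except json.JSONDecodeError:
        pass
    return body
```

This way the endpoint keeps working whichever format the model decides to send.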

Is this a Bug or does anyone have a workaround?


I ended up just sending plain text rather than JSON and it fixed my issue

I think my issue is that I am sending plaintext and I’m still running into issues

How are you handling the requests? I do a wrapper for both post and get.

This is my random plug-in method

from flask import Flask, request, jsonify
from urllib.parse import unquote
import json
import re

app = Flask(__name__)

@app.route('/random', defaults={'path': ''}, methods=['GET', 'POST'])
@app.route('/random/<path:path>', methods=['GET', 'POST'])
def random(path):
    if request.method == 'GET':
        # Extract JSON payload from the path
        payload = re.search(r'\{.*\}', unquote(path))
        if payload:
            payload_str = payload.group()
            try:
                input_data = json.loads(payload_str)
                topic = input_data.get('topic', 'No topic provided')
                data = {
                    'message':
                    f'You have provided the following topic: {topic}. Here is the output of a manifest.json and specification.yaml in mark down code block.'
                }
                return jsonify(data)
            except json.JSONDecodeError:
                pass
        random_text = read_instructions_file('random.txt')
        return random_text
    elif request.method == 'POST':
        input_data = request.get_json()
        topic = input_data.get('topic', 'No topic provided')
        random_text = read_instructions_file('random.txt')
        data = {
            'message':
            f'You have provided the following topic: {topic}. Here is the output of a manifest.json and specification.yaml in mark down code block.\n\n{random_text}'
        }
        return jsonify(data)
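For reference, this is roughly how the GET branch above decodes a URL-encoded JSON payload embedded in the path. The example path segment is made up for illustration:

```python
import json
import re
from urllib.parse import unquote

# A hypothetical path segment as ChatGPT might send it:
# the JSON payload {"topic": "cheese"}, URL-encoded.
path = '%7B%22topic%22%3A%20%22cheese%22%7D'

# Same extraction as the handler: decode, then grab the {...} span.
match = re.search(r'\{.*\}', unquote(path))
input_data = json.loads(match.group())
print(input_data['topic'])  # cheese
```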


@app.route('/convert_curl', methods=['POST'])
def convert_curl():
    input_data = request.get_json()
    topic = input_data.get('curl_command', '{topic}')
    random_text = read_instructions_file('random.txt')
    data = {
        'message': f'You have provided the following CURL: {topic}. Here is the output of a manifest.json and specification.yaml in mark down code block.\n\n{random_text}'
    }
    return jsonify(data)


Same issue here; I have been playing around with it. The weird thing for me is that ChatGPT seems to rewrite the prompt in this version. I'm trying to send out "Best cheeses", but in the REQUEST it refines the prompt beforehand and then runs into issues sending that.

Will update if I find a workaround

I actually might’ve found a solution (shortly after last post). Part of the problem was how ChatGPT was sending the prompt

After re-reading the docs and seeing the 8k character limit for description_for_model I decided to go more in depth in my description for the model (in natural language). It seems to have fixed that issue, specifically describing this process:

A plugin that takes in whatever the user types in as a prompt as a json file, transforms it into text…

hope that helps!


Curious, when you are responding - are you responding with json? Something like: {success: true, perfectPromptResponse: yourResponseText}? and then in your openapi.yaml describing the response body with the perfectPromptResponse property or whatever you want to call it?

Chase W. Norton

This has been working:

return jsonify({
    "content": rephrased_prompt,
})

ngl I’m fairly novice as far as backend stuff and am doing a 48hr hackathon so I’m deep down the rabbit hole and not fully sure what is working anymore


Best of luck on the hack! If you need any help overcoming other blockers, reach out!


thanks! I had a number of blockers but was able to finish and pretty proud of it.

I found the description_for_model field to be very important to it functioning well. I think there will be a lot of fun ways to play with that going forward.

To add to what @duncansmothers said, what I found very important is giving the plugin some examples. For instance in my case, I was trying to send some python code. Because it had to go in the GET request, it needed to create a single string separated with newlines and urlencode it in the request. It would do it sometimes, and other times it would attempt to send plain text.

I ended up including an example in the description of the api call in the manifest, and it made all the difference:

"description_for_model": "Plugin for running python code. Do not include comments in the code. The code must end with a print statement. The output of the print statement will be returned to the plugin as a string. Example: if you wanted to calculate the factorial of 14, you'd send this: {\"code\": \"import math\\nresult = math.factorial(14)\\nprint(result)\"}",

After that example, it seems to be working much more consistently.
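The escaping in that example is exactly what `json.dumps` produces; a quick way to generate such a string for a manifest (the snippet below only demonstrates the escaping, reusing the factorial code from the post above):

```python
import json

code = "import math\nresult = math.factorial(14)\nprint(result)"
body = json.dumps({"code": code})
# In the output, the real newlines become literal \n escapes
# inside the JSON string, and any quotes would be backslash-escaped.
print(body)
```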


Hi!

@sawyer @ruv @duncansmothers @dbasch Did any of you manage to get ChatGPT to post something other than JSON to a ChatGPT plugin? I'm trying to extend my DirReader plugin so that ChatGPT can write files. Since ChatGPT deals nicely with a text/plain response containing the file content when reading files, I thought it could also create a POST with the new file content as text/plain in the request, but all I get in the request is a {}. I hoped the following openapi.yaml would work - in fact, the Swagger editor is completely happy with it - but it doesn't work in ChatGPT:

openapi: 3.0.1
info:
  title: Test ChatGPT Plugin
  description: A plugin that allows the user to inspect a directory and read the contents of files using ChatGPT
  version: 1.0.0
servers:
  - url: http://localhost:3010
paths:
  /messagePlaintext:
    post:
      summary: Sends a message type Plaintext
      operationId: messagePlaintext
      requestBody:
        description: The message to send
        required: true
        content:
          text/plain:
            schema:
              description: Here comes the message
              type: string
      responses:
        '200':
          description: Message received acknowledgement

I tried a couple of other things, but none of them got me something in plain text in the request. Of course, I can also deal with JSON, but if ChatGPT has to recode the text as JSON that would be wasteful and might even be asking for trouble, right? At least in other contexts, formatting constraints degrade its reasoning capacity somewhat.

So, could any of you create a request that is not JSON? If not - I guess that’s a bug / missing feature; do you know where I could suggest that?
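One workaround (a sketch, not an official fix) is to give in and declare a JSON body with a single string field, so the model only has to JSON-escape the text rather than produce free-form structure. A hypothetical requestBody in the style of the spec above:

```yaml
requestBody:
  description: The file content to write, wrapped in JSON
  required: true
  content:
    application/json:
      schema:
        type: object
        required: [content]
        properties:
          content:
            type: string
            description: The raw file content
```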

Thanks a lot!

Hans-Peter


👍

The suggestion to modify `description_for_model` with something like "transforms it into text…" seems to have done the trick with my plugin passing source code.

Still need a few more days of testing to see if it stands the test of time, but it looks promising.

"description_for_model": "Validate Prolog code generated by ChatGPT using SWI-Prolog and return results, success or errors. Transform Prolog code in prompt to application/json before sending to plugin.",


I haven’t tried before. I'm not currently developing other plugins, but next time I'm in build mode I can try testing it out.

That’s great! Glad it was helpful.


This is something I have discovered in my ventures as well.

My solution is to use `shlex`.

No, it doesn’t work in some circumstances, but in others it works very well.
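Assuming this refers to Python's `shlex` module, the idea is to quote a snippet so it survives as a single shell-safe token; a minimal sketch:

```python
import shlex

snippet = 'print("hello, world")'
quoted = shlex.quote(snippet)  # safe to embed as one shell argument
print(quoted)
# shlex.split undoes the quoting, so the snippet round-trips intact
assert shlex.split(quoted) == [snippet]
```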