ChatGPT always tries to parse text/plain body as JSON

I have an endpoint that essentially just evaluates JavaScript. The body of the request is just the code itself. It looks like ChatGPT is respecting that, since it generates the code alone, but there appears to be a Python error on the OpenAI side because it tries to parse the JavaScript body as JSON.

This is the error that is output:

ApiSyntaxError: Could not parse API call kwargs as JSON: exception=Expecting value: line 1 column 1 (char 0) url=http://localhost:3000/javascript/run json_text=// Calculate the number of days between July 4, 1776 and the current date (April 7, 2023).

const startDate = new Date(1776, 6, 4); // July 4, 1776
const currentDate = new Date(2023, 3, 7); // April 7, 2023

const millisecondsPerDay = 24 * 60 * 60 * 1000;
const daysSinceJuly4th1776 = Math.floor((currentDate - startDate) / millisecondsPerDay);

daysSinceJuly4th1776;

This is the spec for the API:

openapi: 3.0.3
info:
  title: JavaScript Code Execution API
  version: 1.0.0
  description: An API to execute JavaScript code snippets.

servers:
  - url: http://localhost:3000
    description: Local development server

paths:
  /javascript/run:
    post:
      summary: Execute a JavaScript code snippet.
      description: Given a snippet of JavaScript code, evaluate it and return the result.
      operationId: runJavascript
      requestBody:
        required: true
        content:
          text/javascript:
            schema:
              type: string
              description: The JavaScript code snippet to execute.
      responses:
        '200':
          description: Code execution was successful.
          content:
            text/plain:
              schema:
                oneOf:
                  - type: string
                  - type: number
                  - type: integer
                  - type: boolean
                  - type: array
                  - type: object
                description: The result of the executed JavaScript code.
        '500':
          description: An error occurred during code execution.
          content:
            text/plain:
              schema:
                type: string
                description: The error message received during code execution.

Note that it will work if I make the API accept JSON, but ChatGPT has to make a few requests to figure out that it has to escape all of the quotes inside the string.
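
To illustrate the escaping it has to get right once the body is JSON (a quick Python demo, not part of my server; the "code" wrapper field here is just hypothetical):

import json

js_snippet = 'const greeting = "hello";\nconsole.log(greeting);'
# Once the snippet is embedded in a JSON string, every inner quote must be escaped:
print(json.dumps({"code": js_snippet}))
# {"code": "const greeting = \"hello\";\nconsole.log(greeting);"}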

Is this a bug, or does anyone have a workaround?

I ended up just sending plain text rather than JSON and it fixed my issue
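
In case it's useful, the Flask side of that can be roughly this (a minimal sketch assuming the /javascript/run endpoint from the spec above; evaluate_javascript is a placeholder for whatever actually runs the code):

from flask import Flask, request

app = Flask(__name__)

@app.route('/javascript/run', methods=['POST'])
def run_javascript():
    # Read the raw request body as text; no JSON parsing, so no quote escaping is needed.
    code = request.get_data(as_text=True)
    result = evaluate_javascript(code)  # placeholder for the actual JS runner
    return str(result), 200, {'Content-Type': 'text/plain'}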

I think my issue is that I am sending plaintext and I’m still running into issues

How are you handling the requests? I use a wrapper for both POST and GET.

This is my random plug-in method

import json
import re
from urllib.parse import unquote

from flask import Flask, jsonify, request

app = Flask(__name__)

# read_instructions_file(...) is a helper defined elsewhere in this app.

@app.route('/random', defaults={'path': ''}, methods=['GET', 'POST'])
@app.route('/random/<path:path>', methods=['GET', 'POST'])
def random(path):
    if request.method == 'GET':
        # Extract JSON payload from the path
        payload = re.search(r'\{.*\}', unquote(path))
        if payload:
            payload_str = payload.group()
            try:
                input_data = json.loads(payload_str)
                topic = input_data.get('topic', 'No topic provided')
                data = {
                    'message':
                    f'You have provided the following topic: {topic}. Here is the output of a manifest.json and specification.yaml in mark down code block.'
                }
                return jsonify(data)
            except json.JSONDecodeError:
                pass
        random_text = read_instructions_file('random.txt')
        return random_text
    elif request.method == 'POST':
        input_data = request.get_json()
        topic = input_data.get('topic', 'No topic provided')
        random_text = read_instructions_file('random.txt')
        data = {
            'message':
            f'You have provided the following topic: {topic}. Here is the output of a manifest.json and specification.yaml in mark down code block.\n\n{random_text}'
        }
        return jsonify(data)

@app.route('/convert_curl', methods=['POST'])
def convert_curl():
    input_data = request.get_json()
    topic = input_data.get('curl_command', '{topic}')
    random_text = read_instructions_file('random.txt')
    data = {
        'message': f'You have provided the following CURL: {topic}. Here is the output of a manifest.json and specification.yaml in mark down code block.\n\n{random_text}'
    }
    return jsonify(data)
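
For context, the GET branch above expects the JSON payload to be URL-encoded into the path itself; building such a request looks roughly like this (a sketch, reusing the 'topic' field from the code above):

import json
from urllib.parse import quote

# What a GET call to the /random/<path> route above would carry in the URL path:
payload = json.dumps({"topic": "example topic"})
path = "/random/" + quote(payload)
print(path)  # /random/%7B%22topic%22%3A%20%22example%20topic%22%7D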


Same issue here, and I have been playing around with it. The weird thing for me is that ChatGPT seems to be rewriting the prompt in this version. I’m trying to send out “Best cheeses”, but in the REQUEST it refines the prompt beforehand and runs into issues sending that.

Will update if I find a workaround

I actually might’ve found a solution (shortly after my last post). Part of the problem was how ChatGPT was sending the prompt.

After re-reading the docs and seeing the 8k character limit for description_for_model, I decided to go more in depth in my description for the model (in natural language). It seems to have fixed the issue, specifically by describing this process:

A plugin that takes in whatever the user types as a prompt, as a JSON file, and transforms it into text…

hope that helps!

Curious, when you are responding, are you responding with JSON? Something like {success: true, perfectPromptResponse: yourResponseText}? And then, in your openapi.yaml, describing the response body with the perfectPromptResponse property (or whatever you want to call it)?

Chase W. Norton

This has been working:

return jsonify({
    "content": rephrased_prompt,
})

ngl I’m fairly novice as far as backend stuff goes, and I’m doing a 48-hour hackathon, so I’m deep down the rabbit hole and not fully sure what is working anymore

Best of luck on the hack! If you need any help overcoming other blockers, reach out!

thanks! I had a number of blockers but was able to finish, and I’m pretty proud of it.

I found the description_for_model field to be very important to it functioning well. I think there will be a lot of fun ways to play with that going forward.

To add to what @duncansmothers said, what I found very important is giving the plugin some examples. For instance, in my case I was trying to send some Python code. Because it had to go in a GET request, the model needed to create a single string separated by newlines and URL-encode it in the request. It would do that sometimes, and other times it would attempt to send plain text.

I ended up including an example in the description of the api call in the manifest, and it made all the difference:

"description_for_model": "Plugin for running python code. Do not include comments in the code. The code must end with a print statement. The output of the print statement will be returned to the plugin as a string. Example: if you wanted to calculate the factorial of 14, you'd send this: {\"code\": \"import math\\nresult = math.factorial(14)\\nprint(result)\"}",

After that example, it seems to be working much more consistently.
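
For anyone wiring up the receiving side, handling that example boils down to decoding the payload and capturing the print output; a rough sketch (not the actual plugin code, and run_snippet is just an illustrative name):

import contextlib
import io
import json
from urllib.parse import unquote

def run_snippet(encoded_payload):
    # encoded_payload is the URL-encoded {"code": "..."} string the model builds,
    # e.g. the factorial example from the description above.
    payload = json.loads(unquote(encoded_payload))
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        exec(payload["code"], {})  # the description requires the code to end with print()
    return buffer.getvalue().strip()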
