gpt-3.5-turbo-16k-0613 function returns invalid JSON when using type "number"

I have a function defined that has a property of type number. Sometimes the completion returns the value with thousands separators, which makes the reply invalid JSON according to the schema.

The function has this structure:

example_function = [
    {
        "name": "export_extraction",
        "description": "Sends the extracted data about example to an external example system.",
        "parameters": {
            "type": "object",
            "properties": {
                "top_node": {
                    "type": "object",
                    "properties": {
                        "top_node_name": {
                            "type": "string",
                            "description": "The name of the top node."
                        },
                        "top_node_type": {
                            "type": "string",
                            "enum": ["example type 1", "example type 2", "example type 3"],
                            "description": "The type of the top node."
                        },
                        "commitment": {
                            "type": "object",
                            "properties": {
                                "commitment": {
                                    "type": "number",
                                    "description": "some description"
                                }
                            },
                            "description": "some description"
                        }
                    },
                    "description": "A top node."
                }
            },
            "required": ["top_node"]
        }
    }
]

I get this reply in the function arguments:

'{\n  "top_node": {\n    "top_node_name": "Name B",\n    "top_node_type": "example type 1",\n    "commitment": {\n      "commitment": 89,652,893\n    }\n  }\n}'

Since 89,652,893 contains commas as thousands separators, it is not a valid JSON number; the parser interprets the commas as boundaries between new properties.
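To illustrate the failure, a minimal reproduction with Python's standard `json` module, using the arguments string from the reply above, shows the decode error:

```python
import json

# The function-call arguments exactly as returned by the model above.
arguments = (
    '{\n  "top_node": {\n    "top_node_name": "Name B",\n'
    '    "top_node_type": "example type 1",\n    "commitment": {\n'
    '      "commitment": 89,652,893\n    }\n  }\n}'
)

try:
    json.loads(arguments)
except json.JSONDecodeError as e:
    # The parser stops at the first comma inside the number, because after
    # "89" it expects either "}" or a new quoted property name.
    print(f"invalid JSON: {e}")
```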

I know I can fix it by defining the type in the function as string, then parsing it myself and hoping no weirdness with different cultures and number formats appears in my source data, but I would prefer that some refinement be spent in the next model iteration on reducing the number of occurrences of invalid JSON for numbers.
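As an interim client-side patch (not a real fix), one could strip digit-group commas from the raw arguments before parsing. The helper below is a hypothetical sketch; the regex assumes no quoted string value in the payload legitimately contains a `digit,digit-group` sequence, so it is naive and would need hardening for real data:

```python
import json
import re

def strip_thousands_separators(raw: str) -> str:
    # Remove commas that separate digit groups, e.g. 89,652,893 -> 89652893.
    # Caution: this also rewrites such sequences inside quoted strings.
    return re.sub(r"(?<=\d),(?=\d{3}\b)", "", raw)

arguments = '{"commitment": 89,652,893}'
parsed = json.loads(strip_thousands_separators(arguments))
print(parsed)  # {'commitment': 89652893}
```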

Thanks and best regards,
Stefan