Invalid JSON response ending with “…”

I’m building a process that expects a valid JSON response from Azure OpenAI, and I’m getting responses (which are very repeatable for the keywords I’m providing) that end early with “…”, breaking the JSON and causing my process to fail.

Any reason why it might be ending the response early?
It shouldn’t be a token issue. I’ve tried temperatures from 0 to 1, and I have a fault loop that retries the same query to see whether it will return a valid response, but it comes back with the same issue each time.

Any suggestions for changes to the prompt, parameters, or other recommendations to address the problem would be incredibly appreciated!

The prompt is the following:

{"messages": [{"role": "user","content": " Please adhere to the following guidelines for all future responses: 1) You are a machine that only returns and replies with valid, iterable RFC8259 compliant JSON in your responses. 2) Do not include warnings, reminders, or explanations in your responses. 3) If no context for a question is provided, guess or assume the context of the topic based on the keywords provided. Do not respond saying that you need more context or information. 4) Your purpose is to generate business context details for technical column names that are being provided as keywords. 5) The response should be the Normalized (Human Readable / non-camel case) Name of the provided keyword..  --- The response must be returned in the following JSON format. [{\"keyword\": \"The original keyword value\", \"context_response\": \" the Normalized (Human Readable / non-camel case) Name of the provided keyword\"}, {\"keyword\": \"The original keyword value\", \"context_response\": \" the Normalized (Human Readable / non-camel case) Name of the provided keyword\"}] --- Provide appropriate values for each of the following keywords (in the following semicolon separated list) as best as possible. --- ApplicationStatusName; BankRoutingAccountNumber; BankAccountNumber; BankRoutingNumber; LandlordRegistrationBusinessPhoneNumberKey; TenantApplicationFutureRentAmount; LandlordToTenantApplicationReferenceNumberBankKey; TenantApplicationTypeNameKey; LandlordToTenantApplicationReferenceNumberEmailAddressKey; TenantApplicationReferenceNumberPhoneKey; LandlordRegistrationPrimaryContactEmailAddressKey; LandlordRegistrationReferenceNumberEmailAddressCount; TenantApplicationNameKey; RecordEffectiveEndTimestamp; TenantApplicationFutureUtilityAssistanceAmount; LandlordRegistrationReferenceNumberFullAddressKey; LandlordRegistrationReferenceNumberBankKey; TenantKey; TenantApplicationStatusKey; TenantApplicationPastDueUtilityAmount; LandlordRegistrationAddressKey; LandlordRegistrationPrimaryContactMobilePhoneNumberKey; TenantToLandlordApplicationReferenceNumberEmailAddressCount; LandlordRegistrationPrimaryContactAddressKey; LandlordRegistrationReferenceNumberPhoneKey" }], "temperature": 1, "max_tokens": 4000, "top_p": 0.95, "frequency_penalty": 0, "presence_penalty": 0, "stop": "None" }

The Response is:

   <usage>
   <completion_tokens>57</completion_tokens>   
   <prompt_tokens>424</prompt_tokens>   
   <total_tokens>481</total_tokens>   </usage>
  
   <model>gpt-35-turbo</model>  
   <id>chatcmpl-8zpo2gRbVrIM4P7fnvLQmPVkIfCfY</id>  
   <choices>
    <finish_reason>stop</finish_reason>   
    <index>0</index>   
    <message>
     <role>assistant</role>    
     <content>[{"keyword": "ApplicationStatusName", "context_response": "Application Status Name"}, {"keyword": "BankRoutingAccountNumber", "context_response": "Bank Routing Account Number"}, {"keyword": "BankAccountNumber", "context_response": "Bank Account Number"}, {"ke...</content>    </message>
    </choices>

Have you turned on JSON mode?

This could solve it: How to use JSON mode with Azure OpenAI Service - Azure OpenAI | Microsoft Learn
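
For reference, JSON mode is just an extra response_format field on the request body. Here is a rough, untested sketch — the resource, deployment, key, and api-version below are placeholders, and the deployment has to be a model version and region that actually support JSON mode:

    import json
    import requests

    # Placeholder endpoint details -- substitute your own resource, deployment,
    # API key, and an api-version / model version that supports JSON mode.
    URL = ("https://YOUR-RESOURCE.openai.azure.com/openai/deployments/"
           "YOUR-DEPLOYMENT/chat/completions?api-version=2024-02-01")
    HEADERS = {"api-key": "YOUR-API-KEY", "Content-Type": "application/json"}

    payload = {
        "messages": [
            {"role": "user",
             "content": "Return valid JSON: the normalized names for BankAccountNumber; TenantKey"}
        ],
        # JSON mode: constrains the model to emit syntactically valid JSON.
        "response_format": {"type": "json_object"},
        "temperature": 0,
        "max_tokens": 4000,
    }

    resp = requests.post(URL, headers=HEADERS, json=payload)
    content = resp.json()["choices"][0]["message"]["content"]
    print(json.loads(content))  # raises ValueError if the content is not valid JSON

One caveat: I believe JSON mode returns a single JSON object rather than a top-level array, so the array format in your prompt would likely need to be wrapped under a key.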

I have! Unfortunately, I’m not in a region that has access to it yet. :frowning:

Have you tried it with OpenAI to see if JSON mode fixes your issue?

The API response is JSON, not a container of tags. I expect that whatever part of your “process” is showing you this format is truncating the response and marking it with the elide marker.
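
A quick way to check is to replay the exact request outside of the process engine and look at the raw body. A rough sketch — the resource, deployment, key, and api-version are placeholders, and your real payload goes in place of the shortened one here:

    import requests

    # Placeholder endpoint details -- substitute your real resource, deployment,
    # API key, and an api-version your resource supports.
    URL = ("https://YOUR-RESOURCE.openai.azure.com/openai/deployments/"
           "YOUR-DEPLOYMENT/chat/completions?api-version=2023-05-15")
    HEADERS = {"api-key": "YOUR-API-KEY", "Content-Type": "application/json"}

    payload = {
        "messages": [{"role": "user", "content": "Your full prompt from above"}],
        "temperature": 1,
        "max_tokens": 4000,
    }

    resp = requests.post(URL, headers=HEADERS, json=payload)
    body = resp.json()
    print(body["choices"][0]["finish_reason"], body["usage"]["completion_tokens"])
    # If the full content prints here without the "..." marker, the truncation
    # is happening downstream in your pipeline, not in the API response.
    print(body["choices"][0]["message"]["content"])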

You can improve the quality of this task by placing the mandatory output format and other permanent operations into a separate “system” role as the first message.
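
Something along these lines — a rough sketch, with the instructions paraphrased and the keyword list shortened, posted the same way as the request above:

    # Same request, with the standing rules moved into a "system" message and
    # only the keyword list left in the "user" message (list shortened here).
    payload = {
        "messages": [
            {
                "role": "system",
                "content": (
                    "You only reply with valid, RFC 8259 compliant JSON. "
                    "Do not include warnings, reminders, or explanations. "
                    "For each keyword in the semicolon-separated list, return an array of "
                    'objects: [{"keyword": "<original keyword>", '
                    '"context_response": "<normalized, human-readable name>"}]'
                ),
            },
            {
                "role": "user",
                "content": "ApplicationStatusName; BankRoutingAccountNumber; BankAccountNumber",
            },
        ],
        "temperature": 0,
        "max_tokens": 4000,
    }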

Thanks Jay, that’s a good call.
I didn’t think that it might be the BPEL process engine that’s having an issue. I should try the same API call using Postman to see if it’s giving me the same issue.

Hi Jay,
I’m still seeing similar issues where Azure OpenAI is responding with … even when calling the API using Postman. :confused:

I’ll try adding a system role message as the first message to see if that helps improve things. Thank you!
