API with GPT-4 Turbo model not providing clean JSON

Hello,
Maybe someone here can help

I have a prompt that worked well with GPT-4 and returned the response in JSON format, but now that I’m testing the GPT-4 Turbo model it’s not working as expected.
The response starts with the following: “```json”
That is ‘messing up’ my system, which expects a JSON-formatted response.

Yes, that’s a known behaviour of GPT-4-turbo models. What prevents you from just extracting the actual JSON after this “prefix”?

Hi @jet.isaacson

Are you using JSON mode?

Are you explicitly asking the model to generate JSON and providing the object structure in the system message?

You should never trust a stochastic model to perfectly adhere to any particular standard.

You should always verify, and if needed correct, the model’s output before passing it on in your workflow.
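A minimal sketch of that kind of check in Python (the `REQUIRED_KEYS` and the function name are hypothetical, not something from this thread):

```python
import json

REQUIRED_KEYS = {"title", "summary"}  # hypothetical fields your workflow expects


def validate_response(raw: str) -> dict:
    """Parse the model output and verify it has the expected structure."""
    data = json.loads(raw)  # raises json.JSONDecodeError on malformed JSON
    missing = REQUIRED_KEYS - data.keys()
    if missing:
        # Fail loudly instead of passing bad data downstream.
        raise ValueError(f"Model response is missing keys: {missing}")
    return data
```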

I’ve used regex to clean the “```json” from the beginning of the response and the “```” from the end of it.
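A minimal sketch of that cleanup in Python (the function name is mine; it assumes the response is wrapped in a Markdown code fence):

```python
import re


def strip_code_fence(raw: str) -> str:
    """Remove a leading ```json (or bare ```) fence and a trailing ``` fence, if present."""
    cleaned = re.sub(r"^\s*```(?:json)?\s*", "", raw)
    cleaned = re.sub(r"\s*```\s*$", "", cleaned)
    return cleaned


# Example:
# strip_code_fence('```json\n{"ok": true}\n```')  ->  '{"ok": true}'
```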

Thanks everybody

Removing the “```json” text from the beginning is definitely the simple solution I went with in the meantime.
I am just worried that this is not stable enough, and that the ‘prefix’ might change and corrupt the response again.

Wondering if there’s a known way to avoid receiving it in the response in the first place.

Using JSON mode as suggested by @sps should resolve the problem and only return the JSON object. I have just tested it with a few different examples using both gpt-4-1106-preview and gpt-4-0125-preview and it worked fine, i.e. no more “```json” was included in the response.
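For reference, JSON mode is enabled by setting response_format in the Chat Completions request. A minimal sketch with the Python SDK (the prompt and the object structure shown are just placeholders):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    response_format={"type": "json_object"},  # JSON mode
    messages=[
        # JSON mode requires the word "JSON" to appear in the messages;
        # describing the expected object structure also helps.
        {
            "role": "system",
            "content": 'Reply in JSON with the shape {"title": string, "summary": string}.',
        },
        {"role": "user", "content": "Summarize the plot of Hamlet."},
    ],
)

print(response.choices[0].message.content)  # plain JSON, no Markdown fence
```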

Amazing! I can always trust the great community here for quality help.

I’ll check about the JSON mode as @jr.2509 & @sps suggested and report back.

Thanks!