Generating large JSON data with ChatGPT

I have a large JSON payload of around 5,082 tokens. I need to integrate the OpenAI API into my project, but before doing that I wanted to verify it would work, so I pasted the original JSON into ChatGPT and asked it to make some changes (I only need the GPT-3.5 model for this purpose). The generation didn't simply stop when it hit the response token limit (which is what I expected, based on a thread I read). Instead, it kept offering the "Continue generating" option, and after a point it started producing garbage. I want to know whether that behaviour is specific to the ChatGPT web interface, and whether it will work correctly through the OpenAI API.
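For context on what the API does at the token limit: unlike the web UI's "Continue generating" button, the API returns a `finish_reason` field on each choice; `"length"` means the output was cut off by the token limit and you must request a continuation yourself. Below is a minimal sketch of that continuation loop. The `ask` callable, model name, and prompt wording are my own placeholder assumptions, not anything from the original question; the commented-out wiring at the bottom shows roughly how it would connect to the official `openai` Python client.

```python
# Sketch: stitch together a long response by re-prompting while the
# model stops due to the token limit (finish_reason == "length").
# The `ask` callable is a hypothetical seam so the loop is testable
# without a live API key.
from typing import Callable, List, Tuple


def generate_full_output(ask: Callable[[List[dict]], Tuple[str, str]],
                         messages: List[dict],
                         max_rounds: int = 5) -> str:
    """Call `ask` repeatedly, asking the model to continue while each
    reply is truncated by the token limit."""
    parts = []
    for _ in range(max_rounds):
        text, finish_reason = ask(messages)
        parts.append(text)
        if finish_reason != "length":
            break  # "stop" means the model finished on its own
        # Feed the partial answer back and explicitly ask to continue.
        messages = messages + [
            {"role": "assistant", "content": text},
            {"role": "user", "content": "Continue exactly where you left off."},
        ]
    return "".join(parts)


# With the real client it would be wired up roughly like this
# (assumes the openai v1.x Python package and a configured API key):
#
# from openai import OpenAI
# client = OpenAI()
#
# def ask(messages):
#     resp = client.chat.completions.create(
#         model="gpt-3.5-turbo", messages=messages, max_tokens=1024)
#     choice = resp.choices[0]
#     return choice.message.content, choice.finish_reason
```

Note that naive continuation can still produce malformed JSON at the seams (e.g. a string split mid-token), so the stitched result should be validated with `json.loads` before use.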