ChatGPT stops streaming output in the middle of a sentence. Is this related to the maximum token length for the output? I am using custom GPTs that I created, with vision enabled.
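For reference, I can't see a finish reason inside the ChatGPT UI for a custom GPT, so the only way I know to test the "token limit" theory is through the API. Below is a minimal sketch of what I mean (assuming the Python `openai` client; the model name, prompt, and `max_tokens` value are just illustrative). When the output is cut off by the token limit, the final streamed chunk reports `finish_reason == "length"`:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Deliberately small max_tokens to force a mid-sentence cutoff.
stream = client.chat.completions.create(
    model="gpt-4o",                      # illustrative model
    messages=[{"role": "user", "content": "Write a long story."}],
    max_tokens=50,
    stream=True,
)

finish_reason = None
for chunk in stream:
    if not chunk.choices:
        continue
    choice = chunk.choices[0]
    if choice.delta.content:
        print(choice.delta.content, end="", flush=True)
    if choice.finish_reason is not None:
        finish_reason = choice.finish_reason

# "length" means the response was truncated by the token limit;
# "stop" means the model ended the message on its own.
print("\nfinish_reason:", finish_reason)
```

Is something equivalent to that `"length"` cutoff what's happening inside custom GPTs, or is this a different (e.g. streaming/connection) issue?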