How do I stop the stream from the model when it's malfunctioning?

I just get an endless series of chunks like this, each carrying nothing but whitespace of varying length:

ChatCompletionChunk(id='chatcmpl-8ZF0BrBsVQbK91yHTku32fXyuF79N', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, role=None, tool_calls=[ChoiceDeltaToolCall(index=0, id=None, function=ChoiceDeltaToolCallFunction(arguments=' ', name=None), type=None)]), finish_reason=None, index=0, logprobs=None)], created=1703409839, model='gpt-4-1106-preview', object='chat.completion.chunk', system_fingerprint='fp_3905aa4f79')

And it just never stops. Fortunately it doesn't seem to bill any tokens for it, but I have to shut the server down entirely to make it stop; I can't even Ctrl+C it. Is there a better way?