Missing First Token in Stream Response from /v1/chat/completions Endpoint

When streaming a response from this endpoint with the `gpt-4` model, the very first token is consistently missing from the received data, so the streamed response is incomplete. I would appreciate help investigating and fixing this so the streaming works correctly.
My Flutter code snippet is below:

final Stream<OpenAIStreamChatCompletionModel> chatStream =
    OpenAI.instance.chat.createStream(
  model: "gpt-4",
  maxTokens: isFilesChat ? 500 : 200,
  temperature: 0.7,
  messages: messages,
);

chatStreamSubscription = chatStreamSubject.stream.listen(
  (OpenAIStreamChatCompletionModel chatStreamEvent) async {
    String? chunkResponse = chatStreamEvent.choices[0].delta.content;
    if (chunkResponse != null) {
      String time = DateTime.now().millisecondsSinceEpoch.toString();
      await setMessageBeforeFirebaseResponse(
        context: context,
        answer: chunkResponse,
        userQuery: userQuery,
        isLoaded: false,
        time: time,
      );
    }
    return Future.value('Empty');
  },
  onDone: () async {
    StreamResponseProvider responseMessage =
        Provider.of<StreamResponseProvider>(context, listen: false);
    responseMessage.isStreamDone = true;
  },
);