Gradual Response in Chat with Blazor 7 and OpenAI GPT Completions

Hello, community!

I’m working on a chat application using Blazor 7, and I’ve integrated the OpenAI GPT “completions” API. I’m facing an interesting challenge and would appreciate some advice: I want to present the responses from the OpenAI API to the user gradually, creating an effect similar to ChatGPT, where the answer appears to be typed out in real time.

Currently, my application sends a request to the OpenAI “completions” endpoint, waits for the complete response, and then displays it to the user all at once. I’d like to change this so the text is rendered incrementally as it arrives, making the chat feel more dynamic and interactive.
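
For context, here is roughly what my current (non-streaming) flow looks like. It’s a simplified sketch rather than my exact code; `Http`, `Messages`, the model name, and the request shape are placeholders:

```csharp
@* Simplified sketch of my current, non-streaming flow (names are placeholders). *@
@using System.Net.Http.Json
@using System.Text.Json
@inject HttpClient Http

@code {
    private List<string> Messages { get; } = new();

    private async Task SendAsync(string prompt)
    {
        // Payload for the legacy completions endpoint.
        var payload = new { model = "gpt-3.5-turbo-instruct", prompt, max_tokens = 500 };

        // The Authorization header (Bearer <API key>) is configured on the HttpClient at startup.
        using var response = await Http.PostAsJsonAsync(
            "https://api.openai.com/v1/completions", payload);
        response.EnsureSuccessStatusCode();

        // Pull choices[0].text out of the finished response.
        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        var text = doc.RootElement.GetProperty("choices")[0].GetProperty("text").GetString();

        // The whole answer is added to the chat in one go, which is what I want to change.
        Messages.Add(text ?? string.Empty);
    }
}
```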

I’m considering using WebSockets to achieve this, but I’m not sure what the best approach is within the Blazor 7 environment. Does anyone have experience implementing this kind of feature, particularly with the OpenAI GPT “completions” API in Blazor? Any suggestions on how to approach the gradual response display, or thoughts on alternative technologies better suited to this, would be greatly appreciated.
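
For reference, I’ve read that the completions endpoint can also stream its answer as server-sent events when the request sets `"stream": true`, so in principle the tokens could be read as they are generated. Below is a rough sketch of how I imagine consuming that stream and re-rendering after each chunk (it reuses the same injected `Http` and `@using` directives as above, and `CurrentResponse` is a placeholder). This is only my assumption, and I’m unsure whether this, WebSockets, or something else entirely is the idiomatic way to do it in Blazor 7:

```csharp
@code {
    private string CurrentResponse { get; set; } = string.Empty;

    private async Task StreamAsync(string prompt)
    {
        // Same endpoint as before, but with "stream": true so the reply
        // comes back as server-sent events instead of one JSON document.
        var request = new HttpRequestMessage(HttpMethod.Post,
            "https://api.openai.com/v1/completions")
        {
            Content = JsonContent.Create(new
            {
                model = "gpt-3.5-turbo-instruct",
                prompt,
                max_tokens = 500,
                stream = true
            })
        };

        using var response = await Http.SendAsync(
            request, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();

        using var reader = new StreamReader(
            await response.Content.ReadAsStreamAsync());

        // Each event is a line of the form "data: {json}", terminated by "data: [DONE]".
        while (await reader.ReadLineAsync() is { } line)
        {
            if (!line.StartsWith("data: ") || line == "data: [DONE]")
                continue;

            using var doc = JsonDocument.Parse(line["data: ".Length..]);
            CurrentResponse += doc.RootElement
                .GetProperty("choices")[0]
                .GetProperty("text")
                .GetString();

            // Re-render after every chunk so the text appears to type itself.
            await InvokeAsync(StateHasChanged);
        }
    }
}
```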

Thank you in advance for your help!