Gradual Response in Chat with Blazor 7 and OpenAI GPT Completions

Hello, community!

I’m working on a chat application using Blazor 7, and I’ve integrated the OpenAI GPT “completions” API. I’m facing an interesting challenge and seeking some advice. I’m aiming to present the responses from the OpenAI API to the user gradually, creating an effect similar to ChatGPT, where the response seems to be typed in real-time.

Currently, my application sends requests to the OpenAI “completions” endpoint and receives complete responses, which are immediately displayed to the user. I’d like to modify this so that the responses are displayed in a more dynamic and interactive manner.
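To make that concrete, here is a minimal sketch of roughly what my current, non-streaming call looks like. Names such as `CompletionClient` and `SendPromptAsync`, and the model value, are placeholders for illustration rather than my exact code:

```csharp
// Minimal sketch of my current, non-streaming call (placeholder names, not my exact code).
// Assumes the implicit usings of a .NET 7 project plus the ones listed here.
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Text.Json;

public class CompletionClient
{
    private readonly HttpClient _http;

    public CompletionClient(HttpClient http, string apiKey)
    {
        _http = http;
        _http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", apiKey);
    }

    // Sends the prompt and only returns once the whole completion has arrived,
    // so the UI can only ever show the finished answer.
    public async Task<string?> SendPromptAsync(string prompt)
    {
        var response = await _http.PostAsJsonAsync(
            "https://api.openai.com/v1/completions",
            new { model = "gpt-3.5-turbo-instruct", prompt, max_tokens = 256 });

        response.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement.GetProperty("choices")[0].GetProperty("text").GetString();
    }
}
```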

I’m considering the use of WebSockets to achieve this, but I’m not sure about the best approach within the Blazor 7 environment. Does anyone have experience or insights on implementing such a feature, particularly with the OpenAI GPT “completions” API in Blazor? Any suggestions or recommendations on how to approach this gradual response display, or thoughts on alternative technologies suitable for this functionality, would be greatly appreciated.
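For reference, this is the kind of incremental consumption I have in mind, assuming the completions endpoint's `"stream": true` option, which returns the reply as server-sent events. Again, all names here are placeholders, a sketch rather than working code from my project:

```csharp
// Sketch of the streaming variant I imagine, assuming the completions endpoint's
// "stream": true option (server-sent events). Placeholder names throughout.
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Runtime.CompilerServices;
using System.Text.Json;

public class StreamingCompletionClient
{
    private readonly HttpClient _http;

    public StreamingCompletionClient(HttpClient http, string apiKey)
    {
        _http = http;
        _http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", apiKey);
    }

    // Yields each text fragment as it arrives instead of waiting for the full reply.
    public async IAsyncEnumerable<string> StreamPromptAsync(
        string prompt, [EnumeratorCancellation] CancellationToken ct = default)
    {
        var request = new HttpRequestMessage(HttpMethod.Post, "https://api.openai.com/v1/completions")
        {
            Content = JsonContent.Create(new
            {
                model = "gpt-3.5-turbo-instruct", // example model name
                prompt,
                max_tokens = 256,
                stream = true
            })
        };

        // ResponseHeadersRead lets us start reading before the body is complete.
        using var response = await _http.SendAsync(
            request, HttpCompletionOption.ResponseHeadersRead, ct);
        response.EnsureSuccessStatusCode();

        using var stream = await response.Content.ReadAsStreamAsync(ct);
        using var reader = new StreamReader(stream);

        string? line;
        while ((line = await reader.ReadLineAsync()) is not null)
        {
            if (string.IsNullOrWhiteSpace(line) || !line.StartsWith("data: ")) continue;

            var payload = line["data: ".Length..];
            if (payload == "[DONE]") yield break;   // the stream's end marker

            using var doc = JsonDocument.Parse(payload);
            var text = doc.RootElement.GetProperty("choices")[0].GetProperty("text").GetString();
            if (!string.IsNullOrEmpty(text)) yield return text;
        }
    }
}
```

In the component I then picture something like `await foreach (var chunk in client.StreamPromptAsync(prompt)) { _reply += chunk; StateHasChanged(); }` so the displayed text grows as fragments arrive. What I can't judge is whether this streamed-HTTP approach is sound in Blazor 7 or whether an explicit WebSocket layer is the better route.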

Thank you in advance for your help!

Hello Icesarbarreto, I'm a computer science student, and right now I'm trying to keep educating myself with small projects. Since I don't have much know-how about implementing the OpenAI API, I've been trying to gather information on the internet. I'm using Blazor 7 too, but I'm not making any progress; I keep getting error codes and don't really know how to fix them. I've tried following tutorials, but passing in my API key is somehow a big challenge for me. Do you have any advice for me, or a good tutorial? If you'd like to help me get a simple response working, it would mean the world to me. Add me on Discord if you want: o5.Fynn, or respond here. Good luck with your project!