OpenAI Assistant Starter Kit

I’m excited to announce the OpenAI Assistant Starter Kit, a sample chat application that helps you get started quickly building fully functional chat web applications with OpenAI + JavaScript.

Try it out live right now by visiting: https://openai-assistant-starter-kit.vercel.app/

What’s special about this Starter Kit?

Take a quick tour through the code by reading my blog post: Use the OpenAI Assistant Starter Kit to Quickly Build New OpenAI Apps - OpenAI Blog - Stephen Walther on OpenAI


Welcome to the dev community.

Thanks for posting with the project tag. Makes it easier for us to keep up with updates.


Hi Margaret - thanks for asking!

I have a blog post that walks through the code at Use the OpenAI Assistant Starter Kit to Quickly Build New OpenAI Apps - OpenAI Blog - Stephen Walther on OpenAI

But great idea: a video walkthrough would be more fun. I’ll put one together.

Stephen


There have been several significant changes to the OpenAI APIs over the last month (April 2024), and I’ve updated the OpenAI Assistant Starter Kit to take advantage of them. These changes let me significantly simplify the Starter Kit code.

What is the OpenAI Assistant Starter Kit?

As a reminder, the OpenAI Assistant Starter Kit is an open-source application that shows how to start building OpenAI Large Language Model (LLM) applications with Next.js, TypeScript, and the OpenAI API. You can view a live version of the Starter Kit here.

[Screen recording: the OpenAI Assistant Starter Kit in action]

Using the New AssistantStream Class

The biggest change to the OpenAI Node library is the introduction of the fromReadableStream() method on the AssistantStream class. This method makes it easy for client code to consume a stream sent from the server.

OpenAI uses Server-Sent Events to stream text responses from an Assistant. Instead of waiting for OpenAI to compose the entire response to a chat message, the response can be sent back to the browser in chunks. That way, the user is not left waiting while nothing happens.
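To make this concrete, here is a minimal sketch of a Next.js route handler that starts an Assistant run and forwards the Server-Sent Events stream to the browser. The route path, request body shape, and the OPENAI_ASSISTANT_ID environment variable are assumptions for illustration, not the Starter Kit’s actual code:

```typescript
// app/api/assistant/route.ts (hypothetical path, for illustration only)
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function POST(request: Request) {
  const { threadId, content } = await request.json();

  // Add the user's chat message to the existing thread
  await openai.beta.threads.messages.create(threadId, {
    role: "user",
    content,
  });

  // Start a streaming run of the Assistant against the thread
  const runStream = openai.beta.threads.runs.stream(threadId, {
    assistant_id: process.env.OPENAI_ASSISTANT_ID!, // assumed env var
  });

  // Forward the Server-Sent Events stream to the browser as it is produced
  return new Response(runStream.toReadableStream());
}
```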

The OpenAI Assistant Starter Kit has been updated to use the AssistantStream.fromReadableStream() method to consume the response from the server. This new method was added to the AssistantStream class last month (see feat: assistant fromReadableStream by stainless-bot · Pull Request #738 · openai/openai-node · GitHub).
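On the client, the response body can be rehydrated into an AssistantStream and its events handled as they arrive. Here is a minimal sketch, assuming the hypothetical /api/assistant route above and a simple callback that appends text to the UI:

```typescript
import { AssistantStream } from "openai/lib/AssistantStream";

// Hypothetical helper: send a message and stream the Assistant's reply
async function streamAssistantReply(
  threadId: string,
  content: string,
  onTextDelta: (chunk: string) => void
) {
  const response = await fetch("/api/assistant", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ threadId, content }),
  });

  // Rebuild an AssistantStream from the Server-Sent Events response body
  const stream = AssistantStream.fromReadableStream(response.body!);

  // Append each chunk of text as soon as it arrives
  stream.on("textDelta", (delta) => {
    if (delta.value) {
      onTextDelta(delta.value);
    }
  });

  // Wait for the run to finish before returning
  await stream.finalMessages();
}
```

Because fromReadableStream() gives you back a full AssistantStream, the client can subscribe to the same events (textCreated, textDelta, messageDone, and so on) that the server-side helpers expose, which is what removes most of the hand-written stream parsing from the Starter Kit.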

Learn More

Let me know if you have any questions or suggestions for improving the Starter Kit. Feedback is super welcome.
