Node Library for Chat Completion via Socket.IO

Real-time chat applications built on OpenAI's language models need a transport that can deliver responses as they are generated, not after the full completion arrives. Socket.IO provides that: instant, bidirectional communication between server and client, so streamed completion chunks reach the user the moment they are produced.

@musaid.qa/openai-socket

// Server Implementation
import { Server } from "socket.io";
import { OpenAISocket } from "@musaid.qa/openai-socket";

const server = new Server();
const port = 2030;
const openai = new OpenAISocket(server, {
  verbose: true, // log socket activity
  client: {
    apiKey: process.env.OPENAI_API_KEY // OpenAI credentials from the environment
  },
  chat: {
    model: 'gpt-3.5-turbo' // model used for every completion
  },
  initMessages: [
    // System prompt prepended to each conversation
    {
      role: 'system',
      content: 'You are a nodejs compiler'
    }
  ]
});

server.listen(port);
console.log(`Listening on port ${port}`);
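Conceptually, the server relays each streamed completion chunk to the connected client as a `content` event and signals completion with an `end` event. That relay pattern can be sketched independently of both Socket.IO and OpenAI using Node's built-in `EventEmitter` (the `streamToClient` helper and the stand-in chunk source below are illustrative, not part of the library's API):

```typescript
import { EventEmitter } from "node:events";

// Hypothetical relay: forward streamed completion chunks one at a time
// as 'content' events, then signal completion with 'end'.
function streamToClient(client: EventEmitter, chunks: Iterable<string>): void {
  for (const chunk of chunks) {
    client.emit("content", chunk); // one delta at a time
  }
  client.emit("end"); // mirrors the 'end' event the client listens for
}

// Usage with a stand-in client:
const fakeClient = new EventEmitter();
let reply = "";
fakeClient.on("content", (c: string) => (reply += c));
fakeClient.on("end", () => console.log(reply)); // prints "Hello from OpenAI"
streamToClient(fakeClient, ["Hello", " from", " OpenAI"]);
```

Emitting per-chunk rather than buffering the whole reply is what makes the chat feel live: the client can render each delta as it lands.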
// Client Implementation
import { Socket, io } from "socket.io-client";
import { EmitEvents } from "@musaid.qa/openai-socket";

const client: Socket<EmitEvents> = io('http://localhost:2030');

client.on('connect', () => {
  // Each 'content' event carries one streamed chunk of the reply.
  client.on('content', (content) => {
    console.log(content);
  });

  // 'end' fires once the completion stream has finished.
  client.on('end', () => {
    console.log('end');
  });

  client.emit('new-message', 'Hello from earth!');
});
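Because `content` delivers the reply incrementally, a client that needs the complete message must accumulate chunks until `end` fires. A minimal, library-agnostic sketch (the `ChunkBuffer` name is illustrative, not part of the package):

```typescript
// Accumulates streamed 'content' chunks until the stream ends.
class ChunkBuffer {
  private parts: string[] = [];

  // Call from the 'content' handler with each incoming chunk.
  push(chunk: string): void {
    this.parts.push(chunk);
  }

  // Call when 'end' fires; returns the full reply and resets the buffer
  // so it is ready for the next message.
  flush(): string {
    const full = this.parts.join("");
    this.parts = [];
    return full;
  }
}
```

Wiring it in is a matter of calling `push` inside the `content` handler and `flush` inside the `end` handler, in place of the `console.log` calls above.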