Messaging chatbot with API

I’m creating a chatbot with an API. The user sends a message and the chat responds, but with each new request the chat restarts the conversation flow. From my analysis I concluded this happens because each new request returns a different ID, so to keep the same conversation flow I tried to capture the ID returned by the chat. I managed to do that, but I can’t send new requests with that ID to continue the flow. I am using JavaScript to build the application on a web page.

```javascript
const apiKey = 'API_KEY';
let chatId = '';

$(document).ready(function () {

$('.send-button').on('click', function () {
    sendMessage();
});


$('.chat-footer textarea').on('input', function () {
    capitalizeFirstLetter();
});

});

function capitalizeFirstLetter() {
var messageInput = $('.chat-footer textarea');
messageInput.val(capitalize(messageInput.val()));
}

function capitalize(str) {
return str.charAt(0).toUpperCase() + str.slice(1);
}
function setChatId(id) {
chatId = id;
}
function sendMessage() {
var messageInput = $('.chat-footer textarea');
var message = messageInput.val();

if (!message) {
    messageInput.css('border', '1px solid red');
    return;
}

messageInput.css('border', 'none');

var status = $('.status');
var btnSubmit = $('.send-button');

status.text('Escrevendo...').show();
btnSubmit.prop('disabled', true).css('cursor', 'not-allowed');
messageInput.prop('disabled', true);


addMessage('user', message);


fetchBotResponse(message);
messageInput.val('');

}

function fetchBotResponse(userMessage) {
const introMessage = { role: 'system', content: 'Prompt' };

const messages = [
    { role: 'system', content: JSON.stringify(introMessage) },
    { role: 'user', content: userMessage },
];
const fluxo = [
    { role: 'user', content: userMessage },
    { role: 'assistant', content: chatId },
];
messages.push({ role: 'assistant', content: chatId });
fetch("https://api.openai.com/v1/chat/completions", {
    method: 'POST',
    headers: {
        Accept: "application/json",
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
        model: "gpt-3.5-turbo",
        messages: messages,
        temperature: 0.7,
        max_tokens: 250
    })
})
.then((response) => {
    if (!response.ok) {
        throw new Error(`HTTP error! Status: ${response.status}`);
    }
    return response.json();
})
.then((response) => {
    let botResponse = response.choices[0].message.content;
    chatId = response.id;
    addMessage('chatbot', botResponse);
    console.log(chatId)
})
.catch((e) => {
    console.log(`Error -> ${e}`);
})
.finally(() => {
    var status = $('.status');
    var btnSubmit = $('.send-button');
    var messageInput = $('.chat-footer textarea');
    
    status.text('Online').show();
    btnSubmit.prop('disabled', false).css('cursor', 'pointer');
    messageInput.prop('disabled', false).val('');
});

}

function addMessage(role, content) {
const messageContainer = document.createElement('div');
const message = document.createElement('div');
const timestamp = getCurrentTimestamp();

messageContainer.classList.add('chat-l', role === 'user' ? 'sent' : 'chat-r');
messageContainer.innerHTML = '<div class="sp"></div>';
message.classList.add('mess', 'mess-' + role);

if (role === 'user') {
    message.innerHTML = `<p>${content}</p><div class="check"><span>${timestamp}</span><img src="img/check-2.webp" class="mic"></div>`;
} else {
    message.innerHTML = `<p>${content}</p><div class="check"><span>${timestamp}</span></div>`;
}
messageContainer.appendChild(message);

$('.chat-box').append(messageContainer);
$('.chat-box').scrollTop($('.chat-box')[0].scrollHeight);

}

function getCurrentTimestamp() {
const now = new Date();
const hours = now.getHours().toString().padStart(2, '0');
const minutes = now.getMinutes().toString().padStart(2, '0');
return hours + ':' + minutes;
}

function formatInputText() {
messageInput.value = messageInput.value.charAt(0).toUpperCase() + messageInput.value.slice(1);
}
```

The chat completions API doesn’t maintain a conversation server-side.

The ID returned in the response is of no use to you for continuing a conversation.

You must maintain the user’s conversation history yourself, and send enough of it as prior user/assistant exchanges before the latest user input so the AI can follow the topic.

Turn one:

system: You are a friendly chatbot
user: choose a new name for yourself

(receive answer): Sure! How about “BuddyBot”? It reflects my friendly nature and willingness to help.

Turn two:

system: You are a friendly chatbot
user: choose a new name for yourself
assistant: Sure! How about “BuddyBot”? It reflects my friendly nature and willingness to help.
user: If I ask “what is your name”, how do you now respond?

(receive informed answer): As BuddyBot, my name is BuddyBot. How can I assist you today?
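
In code, that means keeping the running message list yourself and re-sending it with every request. Here is a minimal sketch, reusing the `apiKey` from the question; the `conversationHistory` and `callChatApi` names are just illustrative:

```javascript
// Kept for the whole session; starts with only the system prompt.
const conversationHistory = [
    { role: 'system', content: 'You are a friendly chatbot' }
];

async function callChatApi(userMessage) {
    // Add the latest user input to the running history.
    conversationHistory.push({ role: 'user', content: userMessage });

    const response = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            Authorization: `Bearer ${apiKey}`,
        },
        body: JSON.stringify({
            model: 'gpt-3.5-turbo',
            messages: conversationHistory, // the full history, not an id
            temperature: 0.7,
            max_tokens: 250,
        }),
    });

    if (!response.ok) {
        throw new Error(`HTTP error! Status: ${response.status}`);
    }

    const data = await response.json();
    const botResponse = data.choices[0].message.content;

    // Store the assistant's reply so the next request includes it.
    conversationHistory.push({ role: 'assistant', content: botResponse });
    return botResponse;
}
```

Each request then carries the whole exchange, so the model can follow the topic without any server-side state, and the completion `id` never needs to be stored.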


Does this mean that the token count grows exponentially with each new turn?


That would be a yes. You can manage it by only sending the last X responses back if you want some control and a limit on the number of tokens… Or summarize long stretches of the previous chat and add that summary to the messages… See the sketch below.
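
For example, a simple cutoff that sends the system prompt plus only the most recent messages might look like this (assuming the `conversationHistory` array from the earlier sketch; the `trimHistory` helper is hypothetical):

```javascript
// Keep the system prompt plus at most the last maxMessages entries.
function trimHistory(history, maxMessages = 10) {
    const [systemPrompt, ...rest] = history;
    return [systemPrompt, ...rest.slice(-maxMessages)];
}

// Build the request from the trimmed copy, while still appending
// every new turn to the full conversationHistory.
const messagesToSend = trimHistory(conversationHistory);
```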

The length of the conversation will grow as a chat session progresses.

You pay for the total input sent with each API call.

It is up to you to decide the trade-off between cost and quality when deciding how many past turns (or a token limit) of chat history you send to accompany the latest question.

If your chatbot uses a maximum of five past turns as its cutoff, each API call settles around the average price of those five turns (in what appears to the user to be a long chat).

Technically it’s square (quadratic), not exponential: if each turn adds roughly the same number of tokens, turn n resends about n turns of history, so the total input tokens billed across a session grow with the square of the number of turns. :wink: