Getting multiple responses for one API call?

Hello,

I'm currently working with the gpt-3.5-turbo model, and I understand that, as of right now, there is no way to track session information without manually feeding the model context from our end. I was wondering whether it is possible to get multiple responses from one API call instead. For example: could I send multiple user messages in the same request and get back a list of responses, one for each user message?
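
From what I can tell, sending several user messages in one request just comes back with a single assistant reply, so I may be going about this the wrong way. Here's a minimal sketch of what I've tried, assuming the pre-1.0 openai Python package (the questions and key are placeholders):

```python
import openai

openai.api_key = "sk-..."  # placeholder

# Two independent questions I'd like answered in one call
questions = [
    "What is the capital of France?",
    "What is the capital of Japan?",
]

# Attempt 1: put both questions into the messages list.
# The API treats them as one conversation and returns a single
# assistant reply, not one reply per question.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": q} for q in questions],
)
print(len(response["choices"]))  # 1

# Attempt 2: the n parameter does return multiple choices, but they
# are all completions of the same prompt, not separate answers.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": questions[0]}],
    n=2,
)
for choice in response["choices"]:
    print(choice["message"]["content"])
```

As far as I can tell, the only way to get one answer per question is one request per question, which is what I was hoping to avoid.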

I haven’t been able to figure this out yet.

Thanks for any help!


Did you solve the problem? In my case, I found that the responses to users were being repeated, meaning that more than one request was being made per answer to a user’s question. I built my application using Node.js and Express.

Hey @jcsavage @gfbane23, check out reliableGPT for this: a Python package for handling batched GPT calls.
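
If you'd rather not add a dependency, here's a rough sketch of batching independent prompts yourself with a thread pool and plain openai calls (this is a generic pattern, not reliableGPT's API; it assumes the pre-1.0 openai Python package and uses placeholder questions):

```python
import openai
from concurrent.futures import ThreadPoolExecutor

openai.api_key = "sk-..."  # placeholder

def ask(question: str) -> str:
    # One chat completion per question; each call is independent.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": question}],
    )
    return response["choices"][0]["message"]["content"]

questions = [
    "Summarize the plot of Hamlet in one sentence.",
    "What does HTTP status 429 mean?",
]

# Fire the requests concurrently and collect one answer per question.
with ThreadPoolExecutor(max_workers=5) as pool:
    answers = list(pool.map(ask, questions))

for question, answer in zip(questions, answers):
    print(question, "->", answer)
```

Each question still costs its own request, but the calls overlap in time, so the wall-clock latency is closer to that of a single request.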

I've got exactly the same issue here. gpt-3.5-turbo sometimes returns two identical replies.