Getting multiple responses for one API call?

Hello,

I'm currently working with the gpt-3.5-turbo model, and I understand that right now there is no way to track session information without manually feeding the model context from our end. I was wondering if it is possible to get multiple responses from one API call instead. For example: could I send multiple user messages in the same request and get back a list of responses, one for each user message? There's a rough sketch of what I mean below.
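
To make the question concrete, here's roughly what I'm doing today versus what I'd like to do. This is just a sketch with the openai Python library (placeholder questions and API key); whether the single-call version is actually possible is exactly what I'm asking.

```python
import openai

openai.api_key = "sk-..."  # placeholder

questions = ["What is the capital of France?", "What is 2 + 2?"]

# What I'm doing today: one ChatCompletion request per user message.
answers = []
for q in questions:
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": q}],
    )
    answers.append(resp.choices[0].message.content)

# What I'd like instead (if it's possible): a single request that returns
# one response per user message, something like:
#
# resp = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user", "content": q} for q in questions],
# )
#
# My impression is that several user messages in one request are treated as
# a single conversation and produce a single assistant reply, but I haven't
# been able to confirm that.
```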

I haven’t been able to figure this out yet.

Thanks for any help!
