We need to have multiple prompts in one request for the chat endpoint.
This would let us programmers use the chat endpoint in a similar fashion to the completions endpoint, which currently accepts up to 20 prompts per request.
This would be very useful, since sending a whole request per prompt is very inefficient.
If I am understanding you correctly, I don't think any context is required; however, that is beside the point. I have a brief setup of an interaction, but I'd like to send the extra user prompts in bulk, because using a whole request for each prompt is very inefficient. This is not supported by the https://api.openai.com/v1/chat/completions endpoint.
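To illustrate the difference, here is a rough sketch of the request payloads involved. The endpoint shapes follow OpenAI's documented HTTP API (a `prompt` field that may be a list for completions, a single `messages` array for chat); the model names are just placeholders, and no actual requests are made:

```python
def completions_payload(prompts):
    # /v1/completions accepts a list of prompts, so a batch of
    # prompts (currently up to 20) fits in one request body.
    return {"model": "gpt-3.5-turbo-instruct", "prompt": prompts}

def chat_payloads(system_msg, user_prompts):
    # /v1/chat/completions takes a single messages array per request,
    # so each user prompt requires its own request body, even when
    # the system setup is identical every time.
    return [
        {
            "model": "gpt-3.5-turbo",
            "messages": [
                {"role": "system", "content": system_msg},
                {"role": "user", "content": prompt},
            ],
        }
        for prompt in user_prompts
    ]

prompts = ["Summarize A", "Summarize B", "Summarize C"]
batch = completions_payload(prompts)          # one request for all three
per_prompt = chat_payloads("Be concise.", prompts)  # three separate requests
print(len(per_prompt))
```

With the feature being requested, the chat endpoint could accept something like a list of message arrays, collapsing those three request bodies into one.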