Thanks. Also, I checked OpenAI's cookbook, which has a method for batching multiple prompts into a single request.
As in the docs:
Batching requests
The OpenAI API has separate limits for requests per minute and tokens per minute.
If you're hitting the limit on requests per minute, but have headroom on tokens per minute, you can increase your throughput by batching multiple tasks into each request. This will allow you to process more tokens per minute, especially with the smaller models.
Sending in a batch of prompts works exactly the same as a normal API call, except that you pass in a list of strings to the prompt parameter instead of a single string.
Warning: the response object may not return completions in the order of the prompts, so always remember to match responses back to prompts using the index field.
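The index-matching step from the warning above can be sketched offline (no API call; SimpleNamespace objects stand in for the response, and the shuffled order is assumed for illustration):

```python
from types import SimpleNamespace

prompts = ["A,", "B,", "C,"]

# Mock response: choices may arrive in any order, but each one
# carries the index of the prompt that produced it.
response = SimpleNamespace(choices=[
    SimpleNamespace(index=2, text=" gamma"),
    SimpleNamespace(index=0, text=" alpha"),
    SimpleNamespace(index=1, text=" beta"),
])

# Match completions back to prompts by index, not by arrival order.
stories = [""] * len(prompts)
for choice in response.choices:
    stories[choice.index] = prompts[choice.index] + choice.text

print(stories)  # ['A, alpha', 'B, beta', 'C, gamma']
```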
1. Example without batching
from openai import OpenAI

client = OpenAI()

num_stories = 10
content = "Once upon a time,"

# serial example, with one story completion per request
for _ in range(num_stories):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": content}],
        max_tokens=20,
    )

    # print story
    print(content + response.choices[0].message.content)
Once upon a time,in a small village nestled between rolling green hills, there lived a young girl named Lily. She had
Once upon a time,in a small village nestled in the heart of a lush forest, lived a young girl named Evelyn.
Once upon a time,in a faraway kingdom, there lived a young princess named Aurora. She was known for her kind
Once upon a time,in a faraway kingdom called Enchantia, there lived a young girl named Ella. Ella was
Once upon a time,in a small village nestled among the rolling hills, lived a young woman named Lucy. Lucy was known
Once upon a time,in a small village nestled between rolling hills, there lived a young girl named Ava. Ava was a
Once upon a time,in a faraway kingdom, there lived a wise and just king named Arthur. King Arthur ruled over
Once upon a time,in a small village nestled among towering mountains, lived a young girl named Lily. She was known for
Once upon a time,in a small village nestled in the heart of a lush forest, there lived a young girl named Lily
Once upon a time,in a far-off kingdom, there lived a kind and beloved queen named Isabella. She ruled with
2. Example with batching
num_stories = 10
prompts = ["Once upon a time,"] * num_stories

# batched example, with 10 story completions per request
# (passing a list of prompts is only supported by the legacy Completions endpoint,
# not by chat.completions)
response = client.completions.create(
    model="curie",
    prompt=prompts,
    max_tokens=20,
)

# match completions to prompts by index
stories = [""] * len(prompts)
for choice in response.choices:
    stories[choice.index] = prompts[choice.index] + choice.text

# print stories
for story in stories:
    print(story)
Once upon a time, I lived in hope. I convinced myself I knew best, because, naive as it might sound,
Once upon a time, Thierry Henry was invited to have a type of frosty exchange with English fans, in which
Once upon a time, and a long time ago as well, PV was passively cooled because coils cooled by use of metal driving
Once upon a time, there was a land called Texas. It was about the size of Wisconsin. It contained, however,
Once upon a time, there was an old carpenter who had three sons. The locksmith never learned to read or write
Once upon a time, there was a small farming town called Moonridge Village, far West across the great vast plains that lay
Once upon a time, California’s shorelines, lakes, and valleys were host to expanses of untamed wilderness
Once upon a time, she said. It started with a simple question: Why don’t we know any stories?
Once upon a time, when I was a young woman, there was a movie named Wuthering Heights. Stand by alleges
Once upon a time, a very long time I mean, in the year 1713, died a beautiful Duchess called the young
Can you tell me some limitations of this method?