Batching with the OpenAI API in Python

Hey! I have a set of about 1,000 review comments, one per row, that I want to classify with the ChatGPT API. I can't send all 1,000 comments in a single request, because the model might struggle with that much input on top of the system prompt that holds the classification instructions. To keep token usage and cost manageable, I'm processing them in batches of 100 rows, 10 batches in total, in Python. Which is better: running the batches in a plain for loop in my own code, or submitting them through the OpenAI Batch API? This is the loop I currently use:

import time

batch_size = 100

def batch_process_and_update(comments, batch_size, worksheet):
    start_time = time.time()
    # Calculate the number of batches, rounding up
    num_batches = (len(comments) + batch_size - 1) // batch_size

    for batch_num in range(num_batches):
        start_index = batch_num * batch_size
        end_index = min((batch_num + 1) * batch_size, len(comments))

        # Prepare this batch of comments and send it to the API
        batch_comments = comments[start_index:end_index]
        generate_response(system_instructions, batch_comments)
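
For context, generate_response is basically a thin wrapper around a single chat-completions call, something along these lines (a simplified sketch; the model name and the numbered-list prompt format are placeholders for what I actually use):

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_response(system_instructions, batch_comments):
    # Number the comments so the model can return one label per index
    numbered = "\n".join(f"{i + 1}. {c}" for i, c in enumerate(batch_comments))
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": system_instructions},
            {"role": "user", "content": numbered},
        ],
    )
    return response.choices[0].message.content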
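
And from what I understand of the docs, the Batch API route would instead mean writing every request to a JSONL file, uploading it, and creating an asynchronous job, at roughly half the per-token price but with up to a 24-hour completion window. A sketch of that flow (file name, custom_id scheme, and model are placeholders):

import json
from openai import OpenAI

client = OpenAI()

# One chat-completion request per comment, written to a JSONL file
with open("batch_requests.jsonl", "w") as f:
    for i, comment in enumerate(comments):
        request = {
            "custom_id": f"comment-{i}",  # used to match results back to rows
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": "gpt-4o-mini",  # placeholder model name
                "messages": [
                    {"role": "system", "content": system_instructions},
                    {"role": "user", "content": comment},
                ],
            },
        }
        f.write(json.dumps(request) + "\n")

# Upload the file and create the batch job
batch_file = client.files.create(
    file=open("batch_requests.jsonl", "rb"), purpose="batch"
)
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)
# Poll client.batches.retrieve(batch.id) until status == "completed",
# then download results via client.files.content(batch.output_file_id)

As I understand it, the trade-off is lower price and relaxed rate limits with the Batch API versus getting results back immediately with the for loop.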