Hello OpenAI Community,
I’m working on a project where I need to process multiple text messages, extracting structured data from each one (e.g., origin city, destination city, freight type, etc.). Currently, I’m making one OpenAI API call per message to get the structured output.
To optimize costs, I’d like to bundle multiple messages (within the token limit) into a single API call and receive a structured response for each message. However, I’m unsure how to best structure the input JSON and process the output in a way that keeps the responses matched to the corresponding input messages.
Here’s what I’m trying to achieve:
- Input: A JSON payload containing multiple messages (as many as fit within the token limit), sent in a single API request.
- Desired Output: A structured JSON output where each message’s extracted information (e.g., origin city, destination city, freight type, etc.) is separately identifiable.
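For concreteness, here's a rough sketch of what I have in mind (plain Python, no actual API call — the message texts, field names, and response shape are just placeholder assumptions): each message gets a stable `id`, the batch is serialized into one prompt, and results are matched back to their source messages by that `id`.

```python
import json

# Hypothetical sample messages (placeholders, not real data).
messages = [
    "Load from Chicago to Dallas, dry van",
    "Reefer needed, Miami to Atlanta",
]

# Tag each message with a stable id so responses can be matched back.
batch_input = [{"id": i, "text": text} for i, text in enumerate(messages)]

# Prompt asking the model to return one JSON object per input id.
prompt = (
    "Extract origin_city, destination_city, and freight_type from each "
    "message. Return a JSON object with key 'results': a list of objects, "
    "each echoing the input 'id'.\n\n"
    + json.dumps({"messages": batch_input})
)

# A response in the shape being requested, shown here as a literal
# for illustration (in practice this would come back from the API):
raw_response = json.dumps({
    "results": [
        {"id": 0, "origin_city": "Chicago", "destination_city": "Dallas",
         "freight_type": "dry van"},
        {"id": 1, "origin_city": "Miami", "destination_city": "Atlanta",
         "freight_type": "reefer"},
    ]
})

# Match each result back to its source message by id.
by_id = {r["id"]: r for r in json.loads(raw_response)["results"]}
for item in batch_input:
    extracted = by_id[item["id"]]
    print(item["id"], extracted["origin_city"], "->",
          extracted["destination_city"])
```

The idea is that even if the model reorders results, the echoed `id` keeps each extraction tied to its input message.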
Has anyone worked on a similar use case, and can you share any guidance on how to structure the input and handle the response efficiently? Any advice on managing large files within token limits while maintaining structured output would be greatly appreciated.
Thank you in advance for your help!