Does LangChain support OpenAI’s Batch API?
How should I use it in LangChain? I couldn’t find detailed information about the OpenAI Batch API in the LangChain documentation.
As far as I’ve seen, no, they don’t support the Batch API. That’s not surprising: the Batch API is asynchronous, so supporting it would require a new paradigm to be designed and built into LangChain.
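In the meantime, one workaround is to call the Batch API directly with the official openai Python SDK and plug the results back into the rest of your pipeline yourself. Below is a minimal sketch of that flow, assuming the openai v1.x client; the file name, model, prompts, and polling interval are just placeholders, and none of it goes through LangChain:

```python
# Sketch: using OpenAI's Batch API directly with the openai SDK.
# File name, model, prompts, and the 60s polling interval are illustrative.
import json
import time

from openai import OpenAI

client = OpenAI()

# 1. Write one chat-completion request per line to a JSONL file.
requests = [
    {
        "custom_id": f"req-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello!", "Summarize LangChain in one line."])
]
with open("batch_input.jsonl", "w") as f:
    f.write("\n".join(json.dumps(r) for r in requests))

# 2. Upload the file and create the batch (completes within the 24h window).
batch_file = client.files.create(file=open("batch_input.jsonl", "rb"), purpose="batch")
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)

# 3. Poll until the batch reaches a terminal state, then fetch the output file.
while batch.status not in ("completed", "failed", "expired", "cancelled"):
    time.sleep(60)
    batch = client.batches.retrieve(batch.id)

if batch.status == "completed":
    results = client.files.content(batch.output_file_id).text
    for line in results.splitlines():
        item = json.loads(line)
        print(item["custom_id"], item["response"]["body"]["choices"][0]["message"]["content"])
```

From there you could map the outputs back to your LangChain objects by `custom_id`, but you have to manage that bookkeeping yourself.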
I’d also like to signal my support for Batch API support in LangChain. Long term, my product will need the Batch API, and the main reason I’m using LangChain is its broad support for models and providers. If I have to break out of LangChain for this use case, I’m more likely to break out of it for future use cases as well.