Does LangChain support OpenAI's Batch API?

As far as I’ve seen, no, LangChain doesn’t support the Batch API. That’s not surprising, considering the asynchronous nature of the Batch API would require a new paradigm to be designed and supported in LangChain.
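
In the meantime, if you do have to break out of LangChain for this, here’s a rough sketch of calling the Batch API directly with the openai Python SDK. Treat it as a minimal example rather than production code; the model name, prompts, and file names are placeholders.

```python
import json
import time

from openai import OpenAI  # official openai Python SDK (v1.x)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Write batch requests to a JSONL file; each line is one chat completion request.
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",  # example model; substitute your own
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(
        ["Summarize LangChain in one sentence.", "What is the OpenAI Batch API?"]
    )
]
with open("batch_input.jsonl", "w") as f:
    for r in requests:
        f.write(json.dumps(r) + "\n")

# 2. Upload the file and create the batch job (results arrive within 24 hours).
batch_file = client.files.create(file=open("batch_input.jsonl", "rb"), purpose="batch")
batch = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/v1/chat/completions",
    completion_window="24h",
)

# 3. Poll until the batch reaches a terminal state, then download the results.
while True:
    batch = client.batches.retrieve(batch.id)
    if batch.status in ("completed", "failed", "expired", "cancelled"):
        break
    time.sleep(60)

if batch.status == "completed":
    output = client.files.content(batch.output_file_id)
    for line in output.text.splitlines():
        result = json.loads(line)
        print(result["custom_id"],
              result["response"]["body"]["choices"][0]["message"]["content"])
```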

I’d like to add my voice in favor of Batch API support in LangChain. Long-term, my product will need to use the Batch API, and the main reason I’m using LangChain is its broad support for models and providers. If I have to break out of LangChain for this use case, I’m more likely to break out for future use cases as well.
