Does LangChain support OpenAI’s Batch API?
How should I use it in LangChain? I couldn’t find detailed information about the OpenAI Batch API in the LangChain documentation.
As far as I’ve seen, no, LangChain doesn’t support the Batch API. That’s not surprising: the Batch API is asynchronous — you upload a file of requests and poll for results within a completion window — so supporting it would require a new paradigm to be designed and built into LangChain.
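For anyone who does end up calling the Batch API directly outside LangChain, here is a minimal sketch of the first step: building the JSONL request file the API expects (one JSON object per line with `custom_id`, `method`, `url`, and `body`). The prompts and model name below are placeholders; the commented-out calls at the end are the real `openai` SDK entry points for the upload-and-poll flow.

```python
# Sketch: build the JSONL input file for OpenAI's Batch API.
# Prompts and model are placeholder values for illustration.
import json

prompts = [
    "Summarize the plot of Hamlet in one sentence.",
    "Translate 'hello' to French.",
]

# One request per line; custom_id lets you match results back to inputs.
lines = [
    json.dumps({
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        },
    })
    for i, prompt in enumerate(prompts)
]
batch_jsonl = "\n".join(lines)

# With the openai SDK you would then upload the file and create the batch:
#   from openai import OpenAI
#   client = OpenAI()
#   f = client.files.create(file=open("batch.jsonl", "rb"), purpose="batch")
#   batch = client.batches.create(
#       input_file_id=f.id,
#       endpoint="/v1/chat/completions",
#       completion_window="24h",
#   )
#   # ...then poll client.batches.retrieve(batch.id) until it completes.
```

That poll-until-done loop is exactly the part that doesn’t map onto LangChain’s synchronous/streaming invocation model today.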
I am signaling my support for Batch API support in LangChain. Long-term, my product will need the Batch API, and the main reason I use LangChain is its broad support for models and providers. If I have to break out of LangChain for this use case, I’m more likely to break out for future ones as well.
Hi @henry5, I built a tool that integrates OpenAI’s Batch API with LangChain with zero lines of code. You can install it with `pip install langasync`; the documentation is on GitHub if you search for “langasync”.
Let me know what you think