pip install semantic-chunker-langchain
Models like GPT-3.5 have trouble with large PDF and TXT files: if you pass in a 1,000+ page PDF, you hit a token limit error because the context window is only around 16,000 tokens. To solve this, I developed a token-aware semantic chunker for LangChain.
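To illustrate the general idea (this is just a minimal sketch of token-aware chunking, not the package's actual API; the function and parameter names below are my own), you can count tokens with tiktoken and greedily pack sentences into chunks that stay under a token budget:

```python
import tiktoken

def token_aware_chunks(text: str, max_tokens: int = 1000,
                       model: str = "gpt-3.5-turbo") -> list[str]:
    """Greedily pack sentences into chunks of at most max_tokens tokens."""
    enc = tiktoken.encoding_for_model(model)
    chunks, current, current_tokens = [], [], 0
    for sentence in text.split(". "):
        n = len(enc.encode(sentence))
        # Start a new chunk if adding this sentence would blow the budget.
        if current and current_tokens + n > max_tokens:
            chunks.append(". ".join(current))
            current, current_tokens = [], 0
        current.append(sentence)
        current_tokens += n
    if current:
        chunks.append(". ".join(current))
    return chunks
```

The package builds on this idea but splits on semantic boundaries rather than naive sentence breaks, so chunks stay both meaningful and under the model's limit.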
Give it a try, I'd love to hear your feedback on it.