Uploading large files

Hey everyone,
I wanted to know if we can upload large files. I know there is a 1 GB limit on file size, but when I tried uploading a database schema via the API, I found there is also a size/token limit on each line (JSON dictionary) of the JSONL file. Is there a way around it?
I did some research on the forum and found that chunking is a good option, but automating that process is tricky because I couldn’t find an exact relationship between characters and tokens (i.e., how many characters make up one token).
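
For what it’s worth, here’s the rough chunking approach I’ve been experimenting with: instead of estimating from characters, it counts tokens directly with the tiktoken library and splits each record on token boundaries. The 2048-token limit and the `"text"` field name are placeholders I made up, not documented values:

```python
import json
import tiktoken

MAX_TOKENS = 2048  # hypothetical per-line token limit; check the docs for the real one
enc = tiktoken.get_encoding("gpt2")

def chunk_text(text, max_tokens=MAX_TOKENS):
    """Split text into pieces of at most max_tokens tokens each."""
    tokens = enc.encode(text)
    for start in range(0, len(tokens), max_tokens):
        yield enc.decode(tokens[start:start + max_tokens])

def chunk_jsonl(in_path, out_path):
    """Rewrite a JSONL file so no record's "text" field exceeds the limit."""
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            record = json.loads(line)
            for piece in chunk_text(record["text"]):
                dst.write(json.dumps({**record, "text": piece}) + "\n")

chunk_jsonl("schema.jsonl", "schema_chunked.jsonl")
```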
Furthermore, does anyone know if it’s possible to integrate the Codex engines with the Answers API, and whether it can be fine-tuned for a particular use case?


Yes, it’s very easy to use Codex with the Answers API: set the model and search_model parameters to the Codex-series engines you want to use, add relevant examples and examples_context, and pass in your documents for factual answers along with the question you want answered.
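
As a rough illustration, here’s a minimal sketch using the legacy openai Python library’s Answer.create helper. The engine names, documents, and examples below are assumptions for demonstration, so swap in whichever engines your account has access to:

```python
import openai

openai.api_key = "sk-..."  # your API key

# Engine names here are assumptions; use the codex engines available to you.
response = openai.Answer.create(
    model="davinci-codex",         # codex engine that generates the answer
    search_model="cushman-codex",  # codex engine that ranks the documents
    question="How do I join the users and orders tables?",
    documents=[
        "users(id, name, email)",
        "orders(id, user_id, total)",
    ],
    examples_context="Table: users(id, name)",
    examples=[["How do I list all user names?", "SELECT name FROM users;"]],
    max_tokens=64,
)
print(response["answers"][0])
```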

Here’s the API reference for /answers.


Thank you! I’ll try it out😁

Thanks! I’ll take a look at embeddings and see if I can use them for my use case!