I have the Python 3 code below. In the code I am using llama_index (which I'd heard is from Meta) to create an index object from my own text corpus. I'm then passing queries to that index object to get responses back from OpenAI's ChatGPT, using my additional text corpus index. I have to provide the API key from my paid OpenAI account to get the index created or to get responses back. My assumption is that llama_index is basically chopping my text corpus up into chunks, and then OpenAI creates the embeddings for those chunks to build the index object. Then, when I pass in a query, OpenAI creates a similar embedding for the query, takes the inner product against the index I already created from my corpus, and returns a response.
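For what it's worth, my mental model of the retrieval step is something like this toy sketch (made-up 3-d vectors and chunk names, not real llama_index code; real OpenAI embeddings are ~1536-d floats):

```python
def dot(a, b):
    # inner product of two equal-length vectors
    return sum(x * y for x, y in zip(a, b))

# pretend these are embeddings of three chunks of my corpus
chunk_embeddings = {
    "chunk about pricing":  [0.9, 0.1, 0.0],
    "chunk about shipping": [0.1, 0.8, 0.2],
    "chunk about returns":  [0.0, 0.2, 0.9],
}

# pretend this is the embedding of my query
query_embedding = [0.85, 0.15, 0.05]

# retrieval = pick the chunk whose embedding has the largest
# inner product with the query embedding
best_chunk = max(chunk_embeddings,
                 key=lambda c: dot(chunk_embeddings[c], query_embedding))
print(best_chunk)  # "chunk about pricing"
```

Is that roughly what's happening under the hood, or am I missing a step?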
I've heard that llama_index is only available for research use, so I'm wondering if I can use it in this scenario as part of a commercial app. I'm paying for my OpenAI account and API key, and as far as I can tell llama_index is just a library I installed in my env that helps chop up a corpus and pass it to an LLM. Does anyone know if llama_index can be used in a commercial pipeline like this? Is there something I'm missing about the process? I've also been hitting rate limits lately, which surprises me since I haven't been doing that much with it, so I'm wondering if they're coming from llama_index rather than OpenAI.
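In case the rate limits are coming from the OpenAI side, the workaround I've been considering is a retry with exponential backoff around my call. A generic sketch (deliberately not tied to a specific exception class from the openai package, since I'm not sure which one gets raised):

```python
import time

def with_backoff(fn, max_retries=5, base_delay=1.0):
    """Call fn(); on failure wait base_delay, 2*base_delay, 4*base_delay, ...
    before retrying, up to max_retries attempts, then re-raise."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# usage with the function from my code below would look like:
# answer = with_backoff(lambda: index_response(api_key, text_path, query))
```

Is that a reasonable way to handle it, or is the throttling happening somewhere inside llama_index itself?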
code:
def index_response(api_key, text_path, query):
    # api_key: the key you generate in your OpenAI account
    import os
    os.environ['OPENAI_API_KEY'] = api_key

    # Load your data into 'Documents', a custom type defined by LlamaIndex
    from llama_index import SimpleDirectoryReader
    documents = SimpleDirectoryReader(text_path).load_data()

    # Build a vector index over the documents
    # (this is where the embeddings get created)
    from llama_index import GPTVectorStoreIndex
    index = GPTVectorStoreIndex.from_documents(documents)

    # Query the index and return the text of the response
    query_engine = index.as_query_engine()
    response = query_engine.query(query)
    return response.response
If I can't use llama_index for a commercial app like this, is there another library anyone can suggest, and an example of creating the embeddings? All help is very much appreciated.
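If it helps, here's roughly what I imagine doing myself without llama_index: chunk the corpus, then request an embedding per chunk. The chunking half is sketched below; it splits on characters for simplicity (I know real splitters work on tokens), and the embedding call is left as a comment since I'm not sure of the exact API shape across openai library versions.

```python
def chunk_text(text, max_chars=1000, overlap=100):
    """Split text into overlapping character windows.
    (A simplification -- real splitters count tokens, not characters.)"""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        if start + max_chars >= len(text):
            break
        start += max_chars - overlap
    return chunks

# For each chunk you would then hit an embedding endpoint, e.g. OpenAI's
# text embedding API, and store the (chunk, embedding) pairs:
#     for chunk in chunk_text(corpus):
#         embedding = ...  # call your embedding endpoint of choice here
```

Would hand-rolling it like this sidestep the licensing question, or am I overcomplicating things?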