Need some help: am I making the right kind of API calls?

Basically what I want to build is a chatbot I can chat with about my WhatsApp or Facebook messages. I’ve got an export of my Facebook messages, with a to and from column, and a date/time stamp.

They are in an Excel file. I've loaded them up and passed them through the API, but I want to know if there's a better, more effective way.

I also want to know if it’s possible for it to dynamically filter things for me. For example, show me messages between 1st May 2023 and 2nd May 2023 or show me the messages where my friend told a joke…
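For the date-range part, you don't need the model at all: if the export is loaded into a pandas DataFrame, you can filter it locally and only send the matching rows. A minimal sketch, assuming (hypothetical) column names `timestamp`, `from`, and `message`; rename to match your actual export:

```python
import pandas as pd

def filter_messages(df, start=None, end=None, sender=None):
    """Filter a messages DataFrame by date range and/or sender.

    Assumes columns 'timestamp', 'from', 'message' (adjust to your export).
    """
    out = df.copy()
    out["timestamp"] = pd.to_datetime(out["timestamp"])
    if start is not None:
        out = out[out["timestamp"] >= pd.to_datetime(start)]
    if end is not None:
        out = out[out["timestamp"] <= pd.to_datetime(end)]
    if sender is not None:
        out = out[out["from"] == sender]
    return out
```

Fuzzy queries like "where my friend told a joke" are different: there's no column to filter on, so those rows have to go to the model (or through an embedding search) rather than a pandas expression.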

This is my API call and instructions. This is where I'm not sure if I could be doing something better, maybe chunking it up or something. Am I missing anything silly?

```python
import openai

class OpenAIClient:
    def __init__(self, api_key):
        self.client = openai.OpenAI(api_key=api_key)

    def get_response(self, prompt, conversation_history, max_tokens=150):
        system_prompt = (
            "SYSTEMPROMPTHERE"
        )

        # Combine the user question with context from the conversation history
        context_prompt = f"Conversation History:\n{conversation_history}\n\nUser Question: {prompt}\nAI Response:"

        messages = [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": context_prompt}
        ]

        response = self.client.chat.completions.create(
            model="gpt-4",
            messages=messages,
            max_tokens=max_tokens
        )

        return response.choices[0].message.content.strip()
```

Then this is how I grab the response and show it in the chat box:

```python
# Third row: Generative AI input/output and Selected chat messages
col3, col4 = st.columns(2)

with col3:
    st.subheader("Ask a Question")
    user_input = st.text_input("Enter your question")
    if st.button("Submit"):
        messages = get_conversation(df, selected_conversation, start_date, end_date)
        truncated_messages = truncate_conversation(messages)
        context = "\n".join(truncated_messages)
        # get_response builds the full prompt from the question and the history
        response = openai_client.get_response(user_input, context, max_tokens=150)
        st.write(response)
```

Did I do it right? If the message thread is too long, do I need to do something else? How can I make it interact and filter things on my behalf? Anything else I could implement? Any ideas?
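For "filter things on my behalf", the Chat Completions API supports tool calling: you describe your filter function as a JSON Schema, the model replies with structured arguments instead of prose, and your code runs the filter. A hedged sketch of the two local pieces (the tool name, parameter names, and the dispatcher are my assumptions, not anything from your code):

```python
import json

# Tool schema passed via the `tools` parameter of
# client.chat.completions.create(...); the model fills in the arguments.
FILTER_TOOL = {
    "type": "function",
    "function": {
        "name": "filter_messages",
        "description": "Filter chat messages by date range and/or sender",
        "parameters": {
            "type": "object",
            "properties": {
                "start_date": {"type": "string", "description": "ISO date, e.g. 2023-05-01"},
                "end_date": {"type": "string", "description": "ISO date, e.g. 2023-05-02"},
                "sender": {"type": "string", "description": "Name of the message sender"},
            },
            "required": [],
        },
    },
}

def dispatch_tool_call(name, arguments_json, registry):
    """Run the local function the model asked for.

    `arguments_json` is the JSON string the API returns in
    tool_call.function.arguments; `registry` maps tool names to
    your own Python callables.
    """
    args = json.loads(arguments_json)
    return registry[name](**args)
```

On the response side you'd check `response.choices[0].message.tool_calls`, dispatch each call, append the results back as `role="tool"` messages, and ask the model again for the final answer. That way "show me messages between 1st May and 2nd May" becomes a structured call your pandas code handles, instead of hoping the model filters text correctly on its own.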