Hey, can someone help me with this? I want to summarize and categorize call transcriptions of calls longer than one hour using LLMs. I've tried several methods, sending them to GPT-3.5 and GPT-4 via the chat completions API, but the token limit for one API call is 16k for 3.5-turbo and 8k for GPT-4, while a one-hour conversation runs around 25–30k tokens, so I can't process it in a single OpenAI API call (excluding GPT-4-32K). Is there any way to run this on foundation models with accuracy comparable to GPT-4's? I'm mostly concerned about the accuracy and quality of the classification and summarization, and I need the output in JSON format since it has to be pasted into Sheets automatically. Please suggest some methods.
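One common workaround (not from the original post, just a sketch of the usual map-reduce pattern) is to split the transcript into overlapping chunks that each fit under the model's context limit, summarize each chunk separately, then ask the model to merge the partial summaries into one final JSON. The helper names below (`chunk_transcript`, `summarize_call`) and the word-count thresholds are hypothetical; in practice you'd count real tokens (e.g. with tiktoken) rather than words, and `summarize_chunk` / `combine` would wrap actual chat-completions calls with a JSON-output prompt:

```python
def chunk_transcript(text, max_words=3000, overlap=200):
    """Split a long transcript into overlapping word-based chunks.

    The overlap keeps context across chunk boundaries so a sentence
    cut mid-way is fully visible in the next chunk. Word counts are a
    rough proxy for tokens; use a real tokenizer for precise limits.
    """
    words = text.split()
    chunks = []
    start = 0
    while start < len(words):
        end = min(start + max_words, len(words))
        chunks.append(" ".join(words[start:end]))
        if end == len(words):
            break
        start = end - overlap  # step back to create the overlap
    return chunks


def summarize_call(text, summarize_chunk, combine):
    """Map-reduce: summarize each chunk, then merge the partials.

    summarize_chunk(chunk) -> str and combine(list_of_summaries) -> dict
    are placeholders for LLM calls; combine would prompt the model to
    emit the final categorized summary as JSON for the Sheets export.
    """
    partials = [summarize_chunk(c) for c in chunk_transcript(text)]
    return combine(partials)
```

With ~3k-word chunks, a 25–30k-token transcript becomes a handful of calls that each fit comfortably inside the 16k window of 3.5-turbo (or you could run the cheaper model on the map step and GPT-4 only on the final combine, where the classification quality matters most).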