ChatGPT plugins are a powerful way to extend the capabilities of the ChatGPT language model by integrating external APIs and services. In this blog post, we'll explore how to build a ChatGPT plugin called "AI Surfer" that allows ChatGPT to surf the internet, summarize articles, and limit token counts using concurrent API connections. We'll also discuss how to deploy the plugin to Replit for free, or to other cloud services.
ChatGPT plugins are integrations that allow ChatGPT to interact with external APIs, databases, or services. By using plugins, ChatGPT can perform tasks such as fetching data, summarizing articles, translating text, and much more. Plugins are defined using a manifest file (ai-plugin.json) and an OpenAPI specification (specification.yaml) that describe the plugin's metadata, authentication, and API endpoints.
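To make that concrete, here is a minimal sketch of what such a manifest can look like. All field values below are illustrative placeholders, not taken from the actual AI Surfer repo, and the spec filename varies by project (openapi.yaml vs. specification.yaml):

```json
{
  "schema_version": "v1",
  "name_for_human": "AI Surfer",
  "name_for_model": "ai_surfer",
  "description_for_human": "Summarize the content of any web page.",
  "description_for_model": "Fetches a URL and returns a concise summary of the page content.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://your-repl.example.com/openapi.yaml"
  },
  "logo_url": "https://your-repl.example.com/logo.png",
  "contact_email": "you@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

ChatGPT reads this manifest at install time, then fetches the OpenAPI spec from the `api.url` it points to in order to learn the plugin's endpoints.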
AI Surfer Plugin: Overview
The AI Surfer plugin empowers ChatGPT to "surf" the internet by summarizing the content of web pages provided by users. By inputting a URL, the plugin leverages OpenAI's GPT-3.5 Turbo language model to generate concise and informative summaries of the web page's content. The plugin's key features and benefits include:
Web Content Summarization: The AI Surfer plugin can distill the essential information from articles, blog posts, and other web content, providing users with quick and easy-to-understand summaries.
Concurrent API Connections: To efficiently handle long articles and reduce token counts, the plugin uses concurrent API connections to process and summarize different sections of the content simultaneously.
Language Model Integration: The plugin integrates with OpenAI's GPT-3.5 Turbo language model, harnessing its natural language processing capabilities to produce high-quality summaries.
Adjustability and Flexibility: The plugin is fully adjustable, allowing developers to customize its behavior and output to suit specific use cases.
Deployment Options: The AI Surfer plugin can be deployed to various cloud services, including Replit, AWS, Heroku, and more, providing flexibility in hosting and scalability.
By enabling ChatGPT to summarize web content, the AI Surfer plugin enhances the language model's capabilities, allowing users to quickly access and understand information from across the web.
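The chunk-and-summarize approach described above can be sketched roughly like this. The function names, the character-based chunk size, and the worker count are illustrative assumptions, not the plugin's actual code; in the real plugin, `summarize_chunk` would wrap a call to the OpenAI chat completions API:

```python
from concurrent.futures import ThreadPoolExecutor


def chunk_text(text: str, max_chars: int = 3000) -> list[str]:
    """Split text into roughly equal chunks so each stays under the
    model's token limit (characters used as a cheap proxy for tokens)."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


def summarize_concurrently(text: str, summarize_chunk, max_chars: int = 3000) -> str:
    """Summarize each chunk on its own worker thread, then join the
    partial summaries in their original order.

    `summarize_chunk` is any callable that takes a chunk of text and
    returns a summary string.
    """
    chunks = chunk_text(text, max_chars)
    with ThreadPoolExecutor(max_workers=8) as pool:
        # executor.map preserves input order, unlike as_completed()
        partial_summaries = list(pool.map(summarize_chunk, chunks))
    return " ".join(partial_summaries)


# Example with a stand-in summarizer (no API call made here):
article = "word " * 2000
summary = summarize_concurrently(article, lambda chunk: chunk[:20].strip() + "...")
```

Because each chunk's API call is mostly network-bound waiting, threads overlap that waiting and a long article finishes in roughly the time of the slowest single chunk rather than the sum of all of them.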
Full how-to and code here (star and follow if you like it).
Hey, I appreciate all the work you've done on this and for sharing it. I don't know if this is the right place to ask, or if somewhere else would be better. I'm trying to get the plugin to run in Replit and I'm getting the error:
Traceback (most recent call last):
  File "main.py", line 9, in <module>
    import openai  # OpenAI's GPT-3 language model library
ModuleNotFoundError: No module named 'openai'
I also tried pip installing it and got errors. I upgraded pip and now when I try to pip install I get:
An error occurred during configuration: option use-feature: invalid choice: 'content-addressable-pool' (choose from 'fast-deps', 'truststore', 'no-binary-enable-wheel-cache')
Thank you. First time using Replit. Maybe I should look at a tutorial. Should I be able to run it locally? I'm getting an error with that too. I tried:
poetry run python main.py
and got
Traceback (most recent call last):
  File "/Users/larry/Dev/OpenAI/Surfer/main.py", line 36, in <module>
    nlp = spacy.load("en_core_web_sm")
  File "/Users/larry/opt/anaconda3/envs/surfer/lib/python3.10/site-packages/spacy/__init__.py", line 54, in load
    return util.load_model(
  File "/Users/larry/opt/anaconda3/envs/surfer/lib/python3.10/site-packages/spacy/util.py", line 449, in load_model
    raise IOError(Errors.E050.format(name=name))
OSError: [E050] Can't find model 'en_core_web_sm'. It doesn't seem to be a Python package or a valid path to a data directory.
Ok, I see a pyproject.toml and a poetry.lock file, so I assumed that doing a poetry install would take care of installing all the packages locally. Anyway, that's what I did. The error doesn't tell me there's a package missing; it says it can't find the model file. I assume by missing modules you mean packages? What am I not understanding here?
[Edit]
I can see that the model file it's complaining about is here:
Ok, I installed it as a plugin. I had to change a couple of addresses in ai-plugin.json, but it works! Finally. Sorry for all the hassle getting it going. Thanks for helping me through it! Works great!
Yes, that's really awesome. I'll be using it. But this taught me a lot about building a more complex plugin than the Todos one, and about how to incorporate other LLMs in the process. It's a great tutorial/example to help people get started.
For anyone else struggling to get it working, as I was (mostly my own fault, since as I said above I'd never used Replit), here are some things to look out for:
First, as we discovered above, the model en_core_web_sm was missing; you can get it with: python -m spacy download en_core_web_sm.
In the `ai-plugin.json` manifest file there are two places where the domain is used: one to tell ChatGPT where the openapi.yaml file is, and one to tell it where the logo URL is. Without the openapi.yaml file location the plugin won't install. I replaced them manually, but I believe the DOMAIN environment variable will do that for you if you supply a value. I'll have to try that.
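If you'd rather not edit the manifest by hand, a small startup hook can substitute the domain into both URLs for you. This is a hypothetical sketch: the `PLUGIN_HOSTNAME` placeholder and the reading of a DOMAIN environment variable are my assumptions, not necessarily how the repo actually wires it up:

```python
import json
import os


def render_manifest(template_text: str, domain: str) -> dict:
    """Replace the PLUGIN_HOSTNAME placeholder with the real domain and
    parse the result, so both the openapi.yaml URL and the logo URL
    point at the deployed host."""
    rendered = template_text.replace("PLUGIN_HOSTNAME", domain)
    return json.loads(rendered)


manifest_template = """{
  "api": {"type": "openapi", "url": "https://PLUGIN_HOSTNAME/openapi.yaml"},
  "logo_url": "https://PLUGIN_HOSTNAME/logo.png"
}"""

# DOMAIN would be set in Replit's Secrets pane (or a local .env file)
domain = os.environ.get("DOMAIN", "my-surfer.repl.co")
manifest = render_manifest(manifest_template, domain)
```

The server would then serve the rendered dict from its /.well-known/ai-plugin.json route instead of the raw template file.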
There also may have been a couple of requirements missing; here's what should have been in the requirements.txt file:
I was also stuck on the fact that the output doesn't show on the summary page, but if you install the plugin and test it, you get a summary from ChatGPT.
As far as Replit goes, I dragged a folder from my local machine and dumped it into the Replit file pane, which caused me all kinds of headaches. I also tried to use poetry in Replit instead of pip install -r requirements.txt.
That made it work and I learned a few things. Ruv's repo deserves some stars.