ruv
1
Building a ChatGPT Plugin: AI Web Surfer
Introduction
ChatGPT plugins are a powerful way to extend the capabilities of the ChatGPT language model by integrating external APIs and services. In this blog post, we'll explore how to build a ChatGPT plugin called "AI Surfer" that allows ChatGPT to surf the internet, summarize articles, and limit token counts using concurrent API connections. We'll also discuss how to deploy the plugin to Replit for free or to other cloud services.
Built Using My Plugin Creator Bot.
What are ChatGPT Plugins?
ChatGPT plugins are integrations that allow ChatGPT to interact with external APIs, databases, or services. By using plugins, ChatGPT can perform tasks such as fetching data, summarizing articles, translating text, and much more. Plugins are defined using a manifest file (ai-plugin.json) and an OpenAPI specification (specification.yaml) that describe the plugin's metadata, authentication, and API endpoints.
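To make that concrete, here is a minimal manifest sketch. The names, URLs, and descriptions below are illustrative placeholders, not the AI Surfer plugin's actual values:

```json
{
  "schema_version": "v1",
  "name_for_human": "AI Surfer",
  "name_for_model": "ai_surfer",
  "description_for_human": "Summarize web pages by URL.",
  "description_for_model": "Fetches a URL and returns a concise summary of its content.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "dev@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

ChatGPT reads `description_for_model` to decide when to call the plugin, and follows `api.url` to the OpenAPI spec that defines the actual endpoints.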
AI Surfer Plugin: Overview
The AI Surfer plugin empowers ChatGPT to "surf" the internet by summarizing the content of web pages provided by users. By inputting a URL, the plugin leverages OpenAI's GPT-3.5 Turbo language model to generate concise and informative summaries of the web page's content. The plugin's key features and benefits include:
- Web Content Summarization: The AI Surfer plugin can distill the essential information from articles, blog posts, and other web content, providing users with quick and easy-to-understand summaries.
- Concurrent API Connections: To efficiently handle long articles and reduce token counts, the plugin uses concurrent API connections to process and summarize different sections of the content simultaneously.
- Language Model Integration: The plugin integrates with OpenAI's GPT-3.5 Turbo language model, harnessing its natural language processing capabilities to produce high-quality summaries.
- Adjustability and Flexibility: The plugin is fully adjustable, allowing developers to customize its behavior and output to suit specific use cases.
- Deployment Options: The AI Surfer plugin can be deployed to various cloud services, including Replit, AWS, Heroku, and more, providing flexibility in hosting and scalability.
By enabling ChatGPT to summarize web content, the AI Surfer plugin enhances the language model's capabilities, allowing users to quickly access and understand information from across the web.
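The concurrent-summarization idea can be sketched roughly as below. This is not the repo's actual code: `summarize_chunk` is a stand-in for the real OpenAI API call (e.g. via `httpx.AsyncClient`), and the 2,000-character chunk size is an arbitrary assumption you would tune to your token budget:

```python
import asyncio

CHUNK_SIZE = 2000  # characters per chunk; arbitrary, tune to your token budget

def split_text(text: str, size: int = CHUNK_SIZE) -> list[str]:
    # Naive fixed-size splitter; a real implementation could split on
    # sentence boundaries (e.g. with spaCy) to avoid cutting sentences.
    return [text[i:i + size] for i in range(0, len(text), size)]

async def summarize_chunk(chunk: str) -> str:
    # Placeholder for an async call to the OpenAI chat completion API.
    # Here we just truncate so the sketch runs without a network.
    await asyncio.sleep(0)
    return chunk[:60]

async def summarize(text: str) -> str:
    chunks = split_text(text)
    # Fire off all chunk summaries concurrently and join the results.
    partials = await asyncio.gather(*(summarize_chunk(c) for c in chunks))
    return " ".join(partials)
```

Because `asyncio.gather` runs the per-chunk requests concurrently, a long article takes roughly as long as its slowest chunk rather than the sum of all chunks.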
Full How-to and Code here (star and follow if you like it.)
Join my Reddit
5 Likes
ruv
2
Thinking about adding support for AutoGPT to this plug-in…
2 Likes
Hey, I appreciate all the work you've done on this and for sharing it. I don't know if this is the right place to ask, or if somewhere else would be better. I'm trying to get the plugin to run in Replit and I'm getting the error:
Traceback (most recent call last):
  File "main.py", line 9, in <module>
    import openai  # OpenAI's GPT-3 language model library
ModuleNotFoundError: No module named 'openai'
I also tried pip installing it and got errors. I upgraded pip and now when I try to pip install I get:
An error occurred during configuration: option use-feature: invalid choice: 'content-addressable-pool' (choose from 'fast-deps', 'truststore', 'no-binary-enable-wheel-cache')
poetry add openai also does not work.
```
TypeError
expected string or bytes-like object
at venv/lib/python3.10/site-packages/poetry/core/utils/helpers.py:27 in canonicalize_name
    23│ canonicalize_regex = re.compile(r"[-_]+")
    24│
    25│
    26│ def canonicalize_name(name):  # type: (str) -> str
  → 27│     return canonicalize_regex.sub("-", name).lower()
    28│
    29│
    30│ def module_name(name):  # type: (str) -> str
    31│     return canonicalize_name(name).replace(".", "_").replace("-", "_")
```
ruv
4
Looks like you didn't install the modules. I should probably update the requirements.txt; for now, pip install each missing module.
ruv
5
Try adding the following to the requirements.txt in your Replit. You may need to click "show hidden files".
openai
requests
beautifulsoup4
fastapi
jinja2
spacy
httpx
uvicorn
Ok, I'll try that, but openai is already in requirements.txt and I can see it in the Replit packages. Here's what's in requirements.txt now.
Flask==2.1.1
flask_cors==3.1.1
requests==2.27.1
beautifulsoup4==4.10.0
openai==0.27.0
[Edit]
I tried it, still the same message.
ruv
7
I'll publish the Replit, you'll be able to just fork it. I'll post here when it's live.
1 Like
Thank you. First time using Replit. Maybe I should look at a tutorial. Should I be able to run it locally? I'm getting an error with that too. I tried:
poetry run python main.py
and got
Traceback (most recent call last):
  File "/Users/larry/Dev/OpenAI/Surfer/main.py", line 36, in <module>
    nlp = spacy.load("en_core_web_sm")
  File "/Users/larry/opt/anaconda3/envs/surfer/lib/python3.10/site-packages/spacy/__init__.py", line 54, in load
    return util.load_model(
  File "/Users/larry/opt/anaconda3/envs/surfer/lib/python3.10/site-packages/spacy/util.py", line 449, in load_model
    raise IOError(Errors.E050.format(name=name))
OSError: [E050] Can't find model 'en_core_web_sm'. It doesn't seem to be a Python package or a valid path to a data directory.
ruv
9
Yeah, you can run it locally. You're still missing the modules; it won't run if they're not installed. Try upgrading your pip.
1 Like
Ok, I see a pyproject.toml and a poetry.lock file, so I assumed that doing a poetry install would take care of installing all the packages locally. Anyway, that's what I did. The error doesn't tell me there's a package missing. It says it can't find the model file. I assume by missing modules you mean packages? What am I not understanding here?
[Edit]
I can see that the model file it's complaining about is here:
./venv/lib/python3.10/site-packages/en_core_web_sm
But when I put that path in, it still complains that it can't find it.
ruv
11
python -m spacy download en_core_web_sm
I used this module so you can plug in other LLMs easily.
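Worth noting: spaCy pipelines like en_core_web_sm are distributed as ordinary pip packages, which is why the traceback above talks about a missing "Python package". A small sketch (the helper name is mine, not from the repo) that checks for one without even importing spaCy:

```python
import importlib.util

def model_installed(name: str = "en_core_web_sm") -> bool:
    # spaCy pipelines install as regular Python packages, so
    # find_spec() can detect them without loading spaCy itself.
    return importlib.util.find_spec(name) is not None

# If this returns False, run: python -m spacy download en_core_web_sm
```

That also explains why pointing `spacy.load()` at the site-packages path didn't help: it expects either an installed package name or the model's data directory, not the package folder itself.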
1 Like
Ok, I see, I'm missing models. I'll try that, thanks.
Ok, I installed it as a plugin. I had to change a couple of addresses in ai-plugin.json, but it works! Finally. Sorry for all the hassle getting it going. Thanks for helping me through it! Works great!
Yes, that's really awesome. I'll be using it. But this taught me a lot about building a plugin that's a bit more complex than the one for Todos, and how to incorporate other LLMs in the process. It's a great tutorial/example to help people get started.
For anyone else struggling to get it working, as I was, it was mostly my fault because, as I said above, I'd never used Replit. Here are some things to look out for:
- First, as we discovered above, the model en_core_web_sm was missing, which you can get with: python -m spacy download en_core_web_sm.
- In the `ai-plugin.json` manifest file there are two places where the domain is used: one to tell it where the openapi.yaml file is, and one to tell it where the logo URL is. Without the openapi.yaml file location the plugin won't install. I replaced them manually, but I believe the DOMAIN environment variable will do that for you if you supply a value. I'll have to try that.
- There also may have been a couple of requirements missing; here's what should have been in the requirements.txt file:
openai
requests
beautifulsoup4
fastapi
jinja2
spacy
httpx
uvicorn
- I was also stuck on the fact that the output doesn't show on the summary page, but if you install the plugin and test it, you get a summary from ChatGPT.
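The DOMAIN substitution mentioned in the list above could work roughly like this sketch. The literal `DOMAIN` placeholder string and the `render_manifest` helper are my assumptions about how such a template might be written, not confirmed details of the repo:

```python
import json

def render_manifest(template: str, domain: str) -> dict:
    # Fill in a literal "DOMAIN" placeholder in both the openapi.yaml
    # URL and the logo URL, then parse the result as JSON.
    return json.loads(template.replace("DOMAIN", domain))

# Hypothetical template with the two domain-dependent fields.
template = '{"api": {"url": "https://DOMAIN/openapi.yaml"}, "logo_url": "https://DOMAIN/logo.png"}'
manifest = render_manifest(template, "myapp.repl.co")
# manifest["api"]["url"] is now "https://myapp.repl.co/openapi.yaml"
```

Doing the substitution from an environment variable at startup would avoid the manual edits described above when the hosting domain changes.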
As far as Replit goes, I dragged a folder from my local machine and dumped it on the Replit file folder, which caused me all kinds of headaches. I also tried to use poetry in Replit instead of using pip install -r requirements.txt.
That made it work and I learned a few things. Ruv's repo deserves some stars.
Thank you Bing Chat for pointing me here (a great loop in action) and thanks @ruv for sharing. Interestingly, it took several prompts to point here.
1 Like
ruv
17
Thanks bing!
(Additional 25 characters)
1 Like
Hi @ruv, a quick message to say a big thank you for your work. You are a genius! Love this openai-api-plugin.
ruv
19
I think the OpenAI API plug-in might be my best one yet. It's crazy useful when used with other plugins.
1 Like