🎯[How-To] Build a ChatGPT Plugin: AI Web Surfer - A Powerful Plugin That Summarizes Web Content and Enhances Conversational AI Experiences (Open Source Code)

Building a ChatGPT Plugin: AI Web Surfer


ChatGPT plugins are a powerful way to extend the capabilities of the ChatGPT language model by integrating external APIs and services. In this blog post, we’ll explore how to build a ChatGPT plugin called “AI Surfer” that allows ChatGPT to surf the internet, summarize articles, and limit token counts using concurrent API connections. We’ll also discuss how to deploy the plugin to Replit for free or to other cloud services.

:star: Built Using My Plugin Creator Bot.

What are ChatGPT Plugins?

ChatGPT plugins are integrations that allow ChatGPT to interact with external APIs, databases, or services. By using plugins, ChatGPT can perform tasks such as fetching data, summarizing articles, translating text, and much more. Plugins are defined using a manifest file (ai-plugin.json) and an OpenAPI specification (openapi.yaml) that describe the plugin’s metadata, authentication, and API endpoints.
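For reference, a minimal ai-plugin.json looks something like the fragment below. The field values here are placeholders, not the AI Surfer plugin’s actual manifest:

```json
{
  "schema_version": "v1",
  "name_for_human": "AI Surfer",
  "name_for_model": "ai_surfer",
  "description_for_human": "Summarizes the content of web pages.",
  "description_for_model": "Fetches a URL and returns a concise summary of the page content.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "dev@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

ChatGPT reads description_for_model to decide when to call the plugin, and fetches the OpenAPI spec from the api.url to learn the available endpoints.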

AI Surfer Plugin: Overview

The AI Surfer plugin empowers ChatGPT to “surf” the internet by summarizing the content of web pages provided by users. By inputting a URL, the plugin leverages OpenAI’s GPT-3.5 Turbo language model to generate concise and informative summaries of the web page’s content. The plugin’s key features and benefits include:

  • Web Content Summarization: The AI Surfer plugin can distill the essential information from articles, blog posts, and other web content, providing users with quick and easy-to-understand summaries.

  • Concurrent API Connections: To efficiently handle long articles and reduce token counts, the plugin uses concurrent API connections to process and summarize different sections of the content simultaneously.

  • Language Model Integration: The plugin integrates with OpenAI’s GPT-3.5 Turbo language model, harnessing its natural language processing capabilities to produce high-quality summaries.

  • Adjustability and Flexibility: The plugin is fully adjustable, allowing developers to customize its behavior and output to suit specific use cases.

  • Deployment Options: The AI Surfer plugin can be deployed to various cloud services, including Replit, AWS, Heroku, and more, providing flexibility in hosting and scalability.

By enabling ChatGPT to summarize web content, the AI Surfer plugin enhances the language model’s capabilities, allowing users to quickly access and understand information from across the web.
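The chunk-and-summarize approach described above can be sketched roughly like this. The function names are illustrative, not the repo’s actual code, and summarize_chunk is a stub standing in for a real GPT-3.5 Turbo API call:

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_chunks(text, max_chars=2000):
    """Split long article text into roughly equal chunks so each
    stays under the model's context limit."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def summarize_chunk(chunk):
    """Stand-in for a real OpenAI chat-completion call that would
    return a short summary of one chunk."""
    # In the real plugin this would call the GPT-3.5 Turbo API.
    return chunk[:60] + "..."

def summarize_article(text, max_workers=4):
    """Summarize the chunks concurrently, then join the partial
    summaries back in their original order."""
    chunks = split_into_chunks(text)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        partial_summaries = list(pool.map(summarize_chunk, chunks))
    return "\n".join(partial_summaries)
```

Because pool.map preserves input order, the joined summary reads in the same sequence as the original article even though the chunks are processed in parallel.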

Full How-to and Code here (:star: star and follow if you like it.)

Join my Reddit


Thinking about adding support for AutoGPT to this plug-in…


Hey, I appreciate all the work you’ve done on this and for sharing it. I don’t know if this is the right place to ask, or if somewhere else would be better. I’m trying to get the plugin to run in Replit and I’m getting the error:

Traceback (most recent call last):
  File "main.py", line 9, in <module>
    import openai  # OpenAI's GPT-3 language model library
ModuleNotFoundError: No module named 'openai'

I also tried pip installing it and got errors. I upgraded pip, and now when I try to pip install I get:

An error occurred during configuration: option use-feature: invalid choice: 'content-addressable-pool' (choose from 'fast-deps', 'truststore', 'no-binary-enable-wheel-cache')

poetry add openai also does not work.


expected string or bytes-like object

at venv/lib/python3.10/site-packages/poetry/core/utils/helpers.py:27 in canonicalize_name
     23│ canonicalize_regex = re.compile(r"[-_.]+")
     26│ def canonicalize_name(name):  # type: (str) -> str
  →  27│     return canonicalize_regex.sub("-", name).lower()
     30│ def module_name(name):  # type: (str) -> str
     31│     return canonicalize_name(name).replace(".", "_").replace("-", "_")

Looks like you didn’t install the modules. I should probably update the requirements.txt so pip installs them for you.

Try adding the following to the requirements.txt in your Replit. You may need to click “show hidden files.”


Ok, I’ll try that, but openai is already in requirements.txt and I can see it in the Replit packages. Here’s what’s in requirements.txt now.


I tried it, still the same message.

I’ll publish the replit, you’ll be able to just fork it. I’ll post here when it’s live.


Thank you. First time using Replit. Maybe I should look at a tutorial. Should I be able to run it locally? I’m getting an error with that too. I tried:

poetry run python main.py

and got

Traceback (most recent call last):
  File "/Users/larry/Dev/OpenAI/Surfer/main.py", line 36, in <module>
    nlp = spacy.load("en_core_web_sm")
  File "/Users/larry/opt/anaconda3/envs/surfer/lib/python3.10/site-packages/spacy/__init__.py", line 54, in load
    return util.load_model(
  File "/Users/larry/opt/anaconda3/envs/surfer/lib/python3.10/site-packages/spacy/util.py", line 449, in load_model
    raise IOError(Errors.E050.format(name=name))
OSError: [E050] Can't find model 'en_core_web_sm'. It doesn't seem to be a Python package or a valid path to a data directory.

Yeah, you can run it locally. You’re still missing the modules; it won’t run if they’re not installed. Try upgrading your pip.


Ok, I see a pyproject.toml and a poetry.lock file, so I assumed that doing a poetry install would take care of installing all the packages locally. Anyway, that’s what I did. The error doesn’t tell me there’s a package missing. It says it can’t find the model file. I assume by missing modules you mean packages? What am I not understanding here?

I can see that the model file it’s complaining about is here:


But when I put that path in, it still complains that it can’t find it.

python -m spacy download en_core_web_sm

I used this module so you can plug in other LLMs easily.
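As a convenience, the model download above can also be handled at startup. A hypothetical helper (not part of the repo) might look like this:

```python
def load_spacy_model(name="en_core_web_sm"):
    """Load the spaCy model, fetching its data on first run if missing.
    Returns None when spaCy itself isn't installed."""
    try:
        import spacy
    except ImportError:
        return None  # spaCy isn't installed; `pip install spacy` first
    try:
        return spacy.load(name)
    except OSError:
        # Model data not found (the E050 error above); download once, then retry.
        from spacy.cli import download
        download(name)
        return spacy.load(name)
```

This avoids the manual `python -m spacy download` step on a fresh environment, at the cost of a one-time download during the first run.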


Ok, I see, I’m missing models. I’ll try that, thanks.

Ok, I installed it as a plugin. I had to change a couple of addresses in ai-plugin.json, but it works! Finally. Sorry for all the hassle getting it going. Thanks for helping me through it! Works great!

Yes, that’s really awesome. I’ll be using it. This taught me a lot about building a more complex plugin than the Todos example, and about how to incorporate other LLMs in the process. It’s a great tutorial/example to help people get started.

For anyone else struggling to get it working, as I was: it was mostly my fault because, as I said above, I’d never used Replit. Here are some things to look out for:

  1. First, as we discovered above, the model en_core_web_sm was missing, which you can get with: python -m spacy download en_core_web_sm.

  2. In the `ai-plugin.json` manifest file there are two places where the domain is used: one tells it where the openapi.yaml file is, and one tells it where the logo URL is. Without the openapi.yaml location the plugin won’t install. I replaced them manually, but I believe the DOMAIN environment variable will do that for you if you supply a value. I’ll have to try that.

  3. There may also have been a couple of requirements missing; here’s what should have been in the requirements.txt file:

  4. I was also stuck on the fact that the output doesn’t show on the summary page, but if you install the plugin and test it, you get a summary from ChatGPT.
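The DOMAIN substitution mentioned in step 2 could be done with a small helper. This is a sketch assuming the manifest uses a literal DOMAIN placeholder, which may differ from how the repo actually templates it:

```python
import json
import os

def render_manifest(template_path="ai-plugin.json", domain=None):
    """Replace a DOMAIN placeholder in the plugin manifest with the
    real deployment domain, taken from the environment by default."""
    domain = domain or os.environ.get("DOMAIN", "localhost:5000")
    with open(template_path) as f:
        raw = f.read()
    # A plain string replace is enough for a simple placeholder scheme.
    rendered = raw.replace("DOMAIN", domain)
    return json.loads(rendered)  # also validates the result is still JSON
```

Setting DOMAIN in Replit’s Secrets panel (or the shell environment) would then fix both the openapi.yaml URL and the logo URL in one place instead of editing them by hand.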

As far as Replit goes, I dragged a folder from my local machine and dropped it into the Replit file panel, which caused me all kinds of headaches. I also tried to use poetry in Replit instead of pip install -r requirements.txt.

Switching to pip install made it work, and I learned a few things. Ruv’s repo deserves some stars.

Thank you Bing Chat for pointing me here (a great loop in action), and thanks @ruv for sharing. Interestingly, it took several prompts to point here.


Thanks bing!



Hi @ruv, a quick message to say a big thank you for your work. You are a genius! Love this openai-api-plugin :pray:

I think the OpenAI API plugin might be my best one yet. It’s crazy useful when used with other plugins.


I called it " the mother of all plugins" : Vincent Sider on LinkedIn: GitHub - ruvnet/chatgpt-openai-api-plugin: A powerful ChatGPT plugin that…
