Can a plugin (installed in ChatGPT) be used with the OpenAI API?

Dear community,
I'm not invited yet, so I can't test it myself.
If a plugin is installed in my ChatGPT account, could it also work with the OpenAI API (GPT-4) under the same account?
For example, if I installed the Web Surf plugin in ChatGPT, would another app that uses the OpenAI API gain the power of that plugin? Would its completions include the results from Web Surf?

No, plugins are a ChatGPT-only feature and are not currently planned for the API.

Thanks for the reply.
Hmmm… that means there is no easy way to feed real-time data to the OpenAI API yet.
I think fine-tuning is only for static data and behaviours, not for dynamic/real-time data. So if I need a real-time data feed, I have to build a prompt that includes the data (JSON, YAML, whatever) on every call. That sounds somewhat inefficient.
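To illustrate what I mean, here is a rough sketch of that brute-force approach with the pre-1.0 openai Python package (the fetch_current_weather helper and the data it returns are just placeholders):

import json

import openai


def fetch_current_weather():
    # Placeholder for whatever real-time source you actually have
    # (a database query, a REST call, a sensor feed, ...).
    return {"temperature_c": 21.5, "condition": "cloudy"}


def answer_with_live_data(question):
    # Fetch fresh data and inject it into the prompt on every single call.
    data = json.dumps(fetch_current_weather())
    prompt = (
        "Answer the question using only the data below.\n"
        f"Data (JSON): {data}\n"
        f"Question: {question}\n"
        "Answer:"
    )
    response = openai.Completion.create(
        engine="text-davinci-003",
        prompt=prompt,
        n=1,
        max_tokens=256,
    )
    return response.choices[0].text.strip()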


Yeah, including it in the prompt is the current best method. Take a look at embeddings tutorials and projects like LangChain for examples of how to find the most relevant data to include in the prompt.
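As a rough sketch of the embeddings side of that (pre-1.0 openai Python package plus numpy; the documents list and helper names are only illustrative): embed your data, embed the incoming question, and put only the best-matching pieces into the prompt:

import numpy as np
import openai

# Illustrative corpus; in practice this is your own reference or real-time data.
documents = [
    "Transcript of yesterday's remarks on infrastructure spending.",
    "Press release about the new budget proposal.",
    "Quarterly report on energy prices.",
]


def embed(texts):
    # Returns one embedding vector per input text.
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
    return [item["embedding"] for item in resp["data"]]


def most_relevant(question, docs, top_k=1):
    # Rank documents by cosine similarity to the question.
    doc_vecs = np.array(embed(docs))
    q_vec = np.array(embed([question])[0])
    sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    return [docs[i] for i in np.argsort(sims)[::-1][:top_k]]


def answer(question):
    # Only the best-matching documents go into the prompt, keeping it small.
    context = "\n".join(most_relevant(question, documents))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    resp = openai.Completion.create(engine="text-davinci-003", prompt=prompt, max_tokens=256)
    return resp.choices[0].text.strip()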


Hey, I was under the assumption that the API CAN handle this? I have the following in my Django application and it seems to do the job, at least when using "older" models, here text-davinci-003:

# Module-level imports needed for this view:
# import openai
# from django.http import HttpResponse
# from django.shortcuts import render

if request.method == 'GET':
    # User's question, passed as a query-string parameter.
    newsquest = request.GET.get("whatsup", '')
    # Name of a ChatGPT plugin endpoint, mentioned in the prompt in the hope
    # that the model will call it (it will not; the API has no plugin support).
    source_1 = 'fiscalnote.list_biden_remarks_remarks_biden__get latest five remarks'
    prompt = f"{newsquest} Consider utilizing {source_1}. please reply in html " \
             "format and add tables when applicable. Add links to documents if they are available."
    response = openai.Completion.create(
        engine=model_completions,  # completion model name, e.g. "text-davinci-003", set elsewhere
        prompt=prompt,
        n=1,
        max_tokens=1024
    )
    result = response.choices[0].text
    return HttpResponse(f'{result}')
return render(request, "index.html")

I have to take this back, even though it hurts a bit. You are right, novaphil. I was wondering whether GPT was just hallucinating these results, and it seems it really does not work (yet). Too bad.