GPT-3 for enterprises

Hi All,

I have integrated GPT-3 with a chatbot and a UiPath robot: a question from the chatbot is forwarded to GPT-3 via the UiPath robot. I recorded a short clip on this, which you can watch here:

I have also written about this set-up on my blog at rpaimoments.com.

I am wondering whether there are any plans to make the model better suited to an enterprise platform, with more limited and focused content, since in an enterprise you might want to answer some questions in a more formal and constrained way. Is there any control mechanism to achieve this?
I am asking because a very basic question I asked about a company returned a totally inappropriate answer. I would really appreciate some guidance on this. Thanks a lot in advance.

I’d suggest checking out the Answers endpoint, which lets you query a set of documents that you provide, with a temperature that defaults to 0, so it’s unlikely to come up with false or irrelevant information. Additionally, you can show the API how to say “unknown.”
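
Roughly, a call could look like the sketch below (the document strings, question, and example Q&A are placeholders I made up, not real company content):

import openai

# Minimal sketch: answer only from company-provided documents.
response = openai.Answer.create(
    search_model="ada",
    model="davinci",
    question="What is the refund policy?",
    documents=[
        "Refunds are issued within 14 days of purchase.",
        "Support is available Monday to Friday, 9am-5pm.",
    ],
    examples_context="Support is available Monday to Friday, 9am-5pm.",
    examples=[["When is support available?", "Monday to Friday, 9am-5pm."]],
    temperature=0,  # deterministic; stays close to the provided documents
    max_tokens=20,
)
print(response)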


Thanks a lot, Joey, for the guidance. I will definitely try this.

When it comes to the enterprise approach, such as limiting the context somehow, is there any way to do this?

By giving it documents (the specialised endpoints), you’re limiting context.
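
For a larger enterprise document set, the same endpoint can also search over an uploaded file instead of inline documents. A sketch, assuming a JSONL file where each line is a {"text": "..."} record (the file name and contents are placeholders):

import openai

# Upload the documents once with purpose "answers", then reference the file ID.
uploaded = openai.File.create(
    file=open("company_docs.jsonl", "rb"),
    purpose="answers",
)

response = openai.Answer.create(
    search_model="ada",
    model="davinci",
    question="When is support available?",
    file=uploaded["id"],  # search over the uploaded documents
    examples_context="Support is available Monday to Friday, 9am-5pm.",
    examples=[["When is support available?", "Monday to Friday, 9am-5pm."]],
    temperature=0,
    max_tokens=20,
)
print(response)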


Could you please provide an example of how to get the Answers API to answer “unknown”?
(I guess it’s done by providing context and Q&A examples where one example covers the unknown case.)

Exactly!

The following code returns “Unknown”:

import openai

# Ask about something not covered by the documents; the "Unknown" example
# in `examples` shows the API how to decline to answer.
response = openai.Answer.create(
    search_model="ada",
    model="davinci",
    question="What color is the boat?",
    documents=["The house is blue."],
    examples_context="The house is red and the grass is yellow.",
    examples=[
        ["What color is the house?", "red"],
        ["What color is the grass?", "yellow"],
        ["What color is the boat?", "Unknown"],
    ],
    temperature=0,
    max_rerank=10,
    max_tokens=5,
    return_prompt=True,
    stop=["\n", "<|endoftext|>"],
)
print(response)
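
The answer itself comes back in the response’s answers list (assuming the standard response shape), so you can read it out and substitute a more formal fallback whenever it is “Unknown”:

answer = response["answers"][0]
if answer == "Unknown":
    answer = "Sorry, I don't have that information."  # hypothetical enterprise-friendly fallback
print(answer)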