Error: InvalidRequestError: This is not a chat model and thus not supported in the v1/chat/completions endpoint

Every time I run the code below with a different model, such as "text-davinci-003" instead of "gpt-3.5-turbo", I receive the following error: InvalidRequestError: This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?

Here is my code:

from langchain import LLMMathChain, OpenAI, SerpAPIWrapper, SQLDatabase
from langchain_experimental.sql import SQLDatabaseChain
from langchain.agents import initialize_agent, Tool
from langchain.agents import AgentType
from langchain.chat_models import ChatOpenAI
from decouple import config

oak = config("OPENAI_API_KEY")  # actual secret key withheld for security
openai_api_key = oak

# create LLM model

llm = ChatOpenAI(temperature=0, model="babbage-002", openai_api_key=openai_api_key)

# create an LLM math tool

llm_math_chain = LLMMathChain.from_llm(llm=llm, verbose=True)

# connect to our database

db = SQLDatabase.from_uri("sqlite:///maids.db")

# create the database chain

db_chain = SQLDatabaseChain.from_llm(llm, db, verbose=True)

tools = [
    Tool(
        name="Yellowsense_Database",
        func=db_chain.run,
        description="useful for when you need to answer questions about maids/cooks/nannies."
    )
]

# creating the agent

agent = initialize_agent(
    tools=tools, llm=llm, agent=AgentType.OPENAI_FUNCTIONS, verbose=True)

user_input = input(
    """You can now chat with your database.
Please enter your question or type 'quit' to exit: """
)


agent.run(user_input)

Please help, and let me know which models I could use instead for my gen AI chatbot to get the best results. I basically want it to recommend maids/cooks/nannies to me based on the info stored in an SQL database.

Welcome to the forum.

I'd suggest taking a look at model endpoint compatibility in the docs. Basically, there are two classes of models: completion and chat completion…

/v1/chat/completions: gpt-4, gpt-4-0613, gpt-4-32k, gpt-4-32k-0613, gpt-3.5-turbo, gpt-3.5-turbo-0613, gpt-3.5-turbo-16k, gpt-3.5-turbo-16k-0613
/v1/completions (Legacy): gpt-3.5-turbo-instruct, babbage-002, davinci-002
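
In LangChain terms, that means picking the wrapper that matches the model family. A rough sketch, assuming the same langchain 0.0.x-style imports as in your snippet (the model names are just examples from the lists above):

from langchain.chat_models import ChatOpenAI  # chat models -> /v1/chat/completions
from langchain.llms import OpenAI             # completion models -> /v1/completions

# Chat model: requests go as a message array to /v1/chat/completions
chat_llm = ChatOpenAI(temperature=0, model="gpt-3.5-turbo", openai_api_key=openai_api_key)

# Completion model: requests go as a plain prompt string to /v1/completions
completion_llm = OpenAI(temperature=0, model_name="gpt-3.5-turbo-instruct", openai_api_key=openai_api_key)

Also note that AgentType.OPENAI_FUNCTIONS relies on OpenAI function calling, so the agent in your code needs one of the chat models, not babbage-002 or another completion model.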

Feel free to come back and let us know if you can’t get it sorted.



tl;dr: what @PaulBellow said:

text-davinci-003 is not a chat model. You can think of text-davinci and the other completion models as autocomplete.

These models have a simpler API than the chat models: they just take a string instead of a message array, hence the response: "This is not a chat model and thus not supported in the v1/chat/completions endpoint".
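
Here's roughly what that difference looks like at the API level. A sketch using the pre-1.0 openai Python client (which matches the InvalidRequestError in your traceback); the prompt text is just an example:

import openai

openai.api_key = openai_api_key  # assuming your key is already loaded into this variable

# Chat model -> /v1/chat/completions: takes a list of messages
chat = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Recommend a cook who is available on weekends."}],
)
print(chat["choices"][0]["message"]["content"])

# Completion model -> /v1/completions: takes a plain prompt string
completion = openai.Completion.create(
    model="gpt-3.5-turbo-instruct",
    prompt="Recommend a cook who is available on weekends.",
)
print(completion["choices"][0]["text"])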

edit: just thought of this analogy:
you’re trying to plug an ethernet cable into a phone line.
