What is the difference between `ConversationalRetrievalChain.from_llm` and `OpenAIFunctionsAgent` in LangChain?

I'm using the LangChain library to build a chatbot.

However, I get somewhat different answers depending on which of the two I use: `ConversationalRetrievalChain` or `OpenAIFunctionsAgent`.

The first (`ConversationalRetrievalChain`) comes from `langchain.chains`, and the second (`OpenAIFunctionsAgent`) from `langchain.agents`.

The second gave me a longer response that contained an accurate answer along with the reasoning behind it, while the first gave me only a short answer.

What is the difference between the two?
Can you explain the underlying principle, or anything else that accounts for this behavior?