create_openai_functions_agent() — langchain Function Reference
Architecture documentation for the create_openai_functions_agent() function in base.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    ae6ff602_535e_4274_9860_cbc4804a314e["create_openai_functions_agent()"]
    8878ac65_f307_870e_1a83_8610e50355b5["base.py"]
    ae6ff602_535e_4274_9860_cbc4804a314e -->|defined in| 8878ac65_f307_870e_1a83_8610e50355b5
    style ae6ff602_535e_4274_9860_cbc4804a314e fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/langchain_classic/agents/openai_functions_agent/base.py lines 287–382
def create_openai_functions_agent(
    llm: BaseLanguageModel,
    tools: Sequence[BaseTool],
    prompt: ChatPromptTemplate,
) -> Runnable:
    """Create an agent that uses OpenAI function calling.

    Args:
        llm: LLM to use as the agent. Should work with OpenAI function calling,
            so either be an OpenAI model that supports that or a wrapper of
            a different model that adds in equivalent support.
        tools: Tools this agent has access to.
        prompt: The prompt to use. See Prompt section below for more.

    Returns:
        A Runnable sequence representing an agent. It takes as input all the same
        input variables as the prompt passed in does. It returns as output either
        an AgentAction or AgentFinish.

    Raises:
        ValueError: If `agent_scratchpad` is not in the prompt.

    Example:

        Creating an agent with no memory

        ```python
        from langchain_openai import ChatOpenAI
        from langchain_classic.agents import (
            AgentExecutor,
            create_openai_functions_agent,
        )
        from langchain_classic import hub

        prompt = hub.pull("hwchase17/openai-functions-agent")
        model = ChatOpenAI()
        tools = ...

        agent = create_openai_functions_agent(model, tools, prompt)

        agent_executor = AgentExecutor(agent=agent, tools=tools)

        agent_executor.invoke({"input": "hi"})

        # Using with chat history
        from langchain_core.messages import AIMessage, HumanMessage

        agent_executor.invoke(
            {
                "input": "what's my name?",
                "chat_history": [
                    HumanMessage(content="hi! my name is bob"),
                    AIMessage(content="Hello Bob! How can I assist you today?"),
                ],
            }
        )
        ```

    Prompt:

        The agent prompt must have an `agent_scratchpad` key that is a
        `MessagesPlaceholder`. Intermediate agent actions and tool output
        messages will be passed in here.

        Here's an example:

        ```python
        from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

        prompt = ChatPromptTemplate.from_messages(
            [
                ("system", "You are a helpful assistant"),
                MessagesPlaceholder("chat_history", optional=True),
                ("human", "{input}"),
                MessagesPlaceholder("agent_scratchpad"),
            ]
        )
        ```
    """
    if "agent_scratchpad" not in (
        prompt.input_variables + list(prompt.partial_variables)
    ):
        msg = (
            "Prompt must have input variable `agent_scratchpad`, but wasn't found. "
            f"Found {prompt.input_variables} instead."
        )
        raise ValueError(msg)
    llm_with_tools = llm.bind(functions=[convert_to_openai_function(t) for t in tools])
    return (
        RunnablePassthrough.assign(
            agent_scratchpad=lambda x: format_to_openai_function_messages(
                x["intermediate_steps"]
            )
        )
        | prompt
        | llm_with_tools
        | OpenAIFunctionsAgentOutputParser()
    )
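Because the returned value is an ordinary Runnable chain (scratchpad formatting, then prompt, then model, then output parser), it can also be invoked directly, without an AgentExecutor. The sketch below is illustrative rather than canonical: the word-length tool, the question, and the default ChatOpenAI model are assumptions, and it presumes an OpenAI API key is configured. The extra `intermediate_steps` key, a list of (AgentAction, tool output) pairs, is what gets formatted into the `agent_scratchpad` placeholder.

```python
# Minimal sketch: invoking the agent Runnable directly (no AgentExecutor).
# The tool and question are illustrative; an OpenAI API key must be configured.
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

from langchain_classic.agents import create_openai_functions_agent


@tool
def get_word_length(word: str) -> int:
    """Return the number of characters in a word."""
    return len(word)


prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant"),
        ("human", "{input}"),
        # Required by create_openai_functions_agent; omitting it raises ValueError.
        MessagesPlaceholder("agent_scratchpad"),
    ]
)

agent = create_openai_functions_agent(ChatOpenAI(), [get_word_length], prompt)

# On the first step there are no intermediate steps yet, so the scratchpad is empty.
step = agent.invoke(
    {"input": "How long is the word 'hello'?", "intermediate_steps": []}
)

# `step` is either an AgentAction (a tool call to make next)
# or an AgentFinish (the final answer).
print(type(step).__name__, getattr(step, "tool", None))
```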
Frequently Asked Questions
What does create_openai_functions_agent() do?
create_openai_functions_agent() builds an agent that drives tools through OpenAI function calling. Given an LLM, a sequence of tools, and a ChatPromptTemplate containing an `agent_scratchpad` MessagesPlaceholder, it returns a Runnable that outputs either an AgentAction (the next tool call to make) or an AgentFinish (the final answer). It is defined in libs/langchain/langchain_classic/agents/openai_functions_agent/base.py in the langchain codebase. If the prompt lacks the `agent_scratchpad` variable, the function raises a ValueError, as in the sketch below.
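A minimal sketch of that failure mode, assuming an OpenAI API key is configured; the prompt here is deliberately missing the placeholder, so the error is raised before any model call is made.

```python
# Illustrative sketch: the prompt is validated eagerly, before any model call.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

from langchain_classic.agents import create_openai_functions_agent

# No `agent_scratchpad` MessagesPlaceholder in this prompt.
bad_prompt = ChatPromptTemplate.from_messages([("human", "{input}")])

try:
    create_openai_functions_agent(ChatOpenAI(), [], bad_prompt)
except ValueError as err:
    print(err)  # complains that `agent_scratchpad` is missing from the prompt
```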
Where is create_openai_functions_agent() defined?
create_openai_functions_agent() is defined in libs/langchain/langchain_classic/agents/openai_functions_agent/base.py at line 287.