
custom_tool.py — langchain Source File

Architecture documentation for custom_tool.py, a Python file in the langchain codebase. 4 imports, 0 dependents.


Entity Profile

Dependency Diagram

```mermaid
graph LR
  2bc1c3fe_4564_705b_4b4f_ef73fdf56b3c["custom_tool.py"]
  589b2e2f_c593_ed0a_7906_df4ca371d542["inspect"]
  2bc1c3fe_4564_705b_4b4f_ef73fdf56b3c --> 589b2e2f_c593_ed0a_7906_df4ca371d542
  2bf6d401_816d_d011_3b05_a6114f55ff58["collections.abc"]
  2bc1c3fe_4564_705b_4b4f_ef73fdf56b3c --> 2bf6d401_816d_d011_3b05_a6114f55ff58
  feec1ec4_6917_867b_d228_b134d0ff8099["typing"]
  2bc1c3fe_4564_705b_4b4f_ef73fdf56b3c --> feec1ec4_6917_867b_d228_b134d0ff8099
  121262a1_0bd6_d637_bce3_307ab6b3ecd4["langchain_core.tools"]
  2bc1c3fe_4564_705b_4b4f_ef73fdf56b3c --> 121262a1_0bd6_d637_bce3_307ab6b3ecd4
  style 2bc1c3fe_4564_705b_4b4f_ef73fdf56b3c fill:#6366f1,stroke:#818cf8,color:#fff
```

Source Code

"""Custom tool decorator for OpenAI custom tools."""

import inspect
from collections.abc import Awaitable, Callable
from typing import Any

from langchain_core.tools import tool


def _make_wrapped_func(func: Callable[..., str]) -> Callable[..., list[dict[str, Any]]]:
    def wrapped(x: str) -> list[dict[str, Any]]:
        return [{"type": "custom_tool_call_output", "output": func(x)}]

    return wrapped


def _make_wrapped_coroutine(
    coroutine: Callable[..., Awaitable[str]],
) -> Callable[..., Awaitable[list[dict[str, Any]]]]:
    async def wrapped(*args: Any, **kwargs: Any) -> list[dict[str, Any]]:
        result = await coroutine(*args, **kwargs)
        return [{"type": "custom_tool_call_output", "output": result}]

    return wrapped


def custom_tool(*args: Any, **kwargs: Any) -> Any:
    """Decorator to create an OpenAI custom tool.

    Custom tools allow for tools with (potentially long) freeform string inputs.

    See below for an example using LangGraph:

    ```python
    @custom_tool
    def execute_code(code: str) -> str:
        \"\"\"Execute python code.\"\"\"
        return "27"


    model = ChatOpenAI(model="gpt-5", output_version="responses/v1")

    agent = create_react_agent(model, [execute_code])

    input_message = {"role": "user", "content": "Use the tool to calculate 3^3."}
    for step in agent.stream(
        {"messages": [input_message]},
        stream_mode="values",
    ):
        step["messages"][-1].pretty_print()
    ```

    You can also specify a format for a corresponding context-free grammar using the
    `format` kwarg:

    ```python
    from langchain_openai import ChatOpenAI, custom_tool
    from langgraph.prebuilt import create_react_agent

    grammar = \"\"\"
    start: expr
    expr: term (SP ADD SP term)* -> add
    | term
    term: factor (SP MUL SP factor)* -> mul
    | factor
    factor: INT
    SP: " "
    ADD: "+"
    MUL: "*"
    %import common.INT
    \"\"\"

    format = {"type": "grammar", "syntax": "lark", "definition": grammar}

    # highlight-next-line
    @custom_tool(format=format)
    def do_math(input_string: str) -> str:
        \"\"\"Do a mathematical operation.\"\"\"
        return "27"


    model = ChatOpenAI(model="gpt-5", output_version="responses/v1")

    agent = create_react_agent(model, [do_math])

    input_message = {"role": "user", "content": "Use the tool to calculate 3^3."}
    for step in agent.stream(
        {"messages": [input_message]},
        stream_mode="values",
    ):
        step["messages"][-1].pretty_print()
    ```
    """

    def decorator(func: Callable[..., Any]) -> Any:
        metadata = {"type": "custom_tool"}
        if "format" in kwargs:
            metadata["format"] = kwargs.pop("format")
        tool_obj = tool(infer_schema=False, **kwargs)(func)
        tool_obj.metadata = metadata
        tool_obj.description = func.__doc__
        if inspect.iscoroutinefunction(func):
            tool_obj.coroutine = _make_wrapped_coroutine(func)
        else:
            tool_obj.func = _make_wrapped_func(func)
        return tool_obj

    if args and callable(args[0]) and not kwargs:
        return decorator(args[0])

    return decorator
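The file combines two small patterns: string-returning tools are wrapped so their result is repackaged as a `custom_tool_call_output` content block, and `custom_tool` itself works both bare (`@custom_tool`) and parametrized (`@custom_tool(format=...)`) by checking whether it was handed a callable directly. A stdlib-only sketch of both patterns follows; `sketch_tool` and `make_wrapped_func` are hypothetical stand-ins, since the real decorator delegates to `langchain_core.tools.tool`:

```python
# Stdlib-only sketch of the module's two patterns. `sketch_tool` and
# `make_wrapped_func` are hypothetical stand-ins for the real decorator,
# which delegates to langchain_core and is not reproduced here.
from collections.abc import Callable
from typing import Any


def make_wrapped_func(func: Callable[..., str]) -> Callable[..., list[dict[str, Any]]]:
    # Mirrors _make_wrapped_func: the tool's string result becomes a
    # "custom_tool_call_output" content block.
    def wrapped(x: str) -> list[dict[str, Any]]:
        return [{"type": "custom_tool_call_output", "output": func(x)}]

    return wrapped


def sketch_tool(*args: Any, **kwargs: Any) -> Any:
    def decorator(func: Callable[..., str]) -> Callable[..., list[dict[str, Any]]]:
        wrapped = make_wrapped_func(func)
        # Mirrors how custom_tool stashes "format" (and friends) in metadata.
        wrapped.metadata = {"type": "custom_tool", **kwargs}  # type: ignore[attr-defined]
        return wrapped

    # Bare use (@sketch_tool): the decorated function arrives as args[0].
    if args and callable(args[0]) and not kwargs:
        return decorator(args[0])
    # Parametrized use (@sketch_tool(format=...)): return the decorator.
    return decorator


@sketch_tool
def execute_code(code: str) -> str:
    """Pretend to execute Python code."""
    return "27"


print(execute_code("3**3"))  # [{'type': 'custom_tool_call_output', 'output': '27'}]
```

The bare/parametrized check is the same one the real `custom_tool` performs at its end: a lone callable positional argument with no keyword arguments means the decorator was applied without parentheses.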

Domain

LangChainCore

Subdomains

MessageInterface

Dependencies

  • collections.abc
  • inspect
  • langchain_core.tools
  • typing

Frequently Asked Questions

What does custom_tool.py do?
custom_tool.py is a Python source file in the langchain codebase (LangChainCore domain, MessageInterface subdomain). It provides the `custom_tool` decorator for creating OpenAI custom tools, which accept potentially long freeform string inputs.
What functions are defined in custom_tool.py?
custom_tool.py defines 3 function(s): _make_wrapped_coroutine, _make_wrapped_func, custom_tool.
What does custom_tool.py depend on?
custom_tool.py imports 4 module(s): collections.abc, inspect, langchain_core.tools, typing.
Where is custom_tool.py in the architecture?
custom_tool.py is located at libs/partners/openai/langchain_openai/tools/custom_tool.py (domain: LangChainCore, subdomain: MessageInterface, directory: libs/partners/openai/langchain_openai/tools).
