ChatOpenAI Class — langchain Architecture

Architecture documentation for the ChatOpenAI class, defined in base.py in the langchain codebase.

Entity Profile

Dependency Diagram

```mermaid
graph TD
  ChatOpenAI["ChatOpenAI"]
  BaseChatOpenAI["BaseChatOpenAI"]
  ChatOpenAI -->|extends| BaseChatOpenAI
  base_py["base.py"]
  ChatOpenAI -->|defined in| base_py
  lc_secrets["lc_secrets()"]
  ChatOpenAI -->|method| lc_secrets
  get_lc_namespace["get_lc_namespace()"]
  ChatOpenAI -->|method| get_lc_namespace
  lc_attributes["lc_attributes()"]
  ChatOpenAI -->|method| lc_attributes
  is_lc_serializable["is_lc_serializable()"]
  ChatOpenAI -->|method| is_lc_serializable
  default_params["_default_params()"]
  ChatOpenAI -->|method| default_params
  get_request_payload["_get_request_payload()"]
  ChatOpenAI -->|method| get_request_payload
  stream["_stream()"]
  ChatOpenAI -->|method| stream
  astream["_astream()"]
  ChatOpenAI -->|method| astream
  with_structured_output["with_structured_output()"]
  ChatOpenAI -->|method| with_structured_output
```
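
The methods listed in the diagram include `with_structured_output()`, which wraps the model in a runnable that parses responses into a caller-supplied schema. A minimal sketch of how this is typically used (the `Joke` schema, prompt, and model name below are illustrative, not part of the langchain codebase):

```python
from pydantic import BaseModel, Field

from langchain_openai import ChatOpenAI


class Joke(BaseModel):
    """Illustrative schema; not defined in the langchain codebase."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline of the joke")


# Example model name and settings; see the init-args tables in the source below.
model = ChatOpenAI(model="gpt-4o-mini", temperature=0)

# with_structured_output() returns a runnable whose output is a Joke instance.
structured_model = model.with_structured_output(Joke)
joke = structured_model.invoke("Tell me a joke about dependency graphs")
print(joke.setup)
print(joke.punchline)
```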

Source Code

libs/partners/openai/langchain_openai/chat_models/base.py, lines 2262–3487 (excerpt)

class ChatOpenAI(BaseChatOpenAI):  # type: ignore[override]
    r"""Interface to OpenAI chat model APIs.

    ???+ info "Setup"

        Install `langchain-openai` and set environment variable `OPENAI_API_KEY`.

        ```bash
        pip install -U langchain-openai

        # or using uv
        uv add langchain-openai
        ```

        ```bash
        export OPENAI_API_KEY="your-api-key"
        ```

    ??? info "Key init args — completion params"

        | Param               | Type          | Description                                                                                                 |
        | ------------------- | ------------- | ----------------------------------------------------------------------------------------------------------- |
        | `model`             | `str`         | Name of OpenAI model to use.                                                                                |
        | `temperature`       | `float`       | Sampling temperature.                                                                                       |
        | `max_tokens`        | `int | None`  | Max number of tokens to generate.                                                                           |
        | `logprobs`          | `bool | None` | Whether to return logprobs.                                                                                 |
        | `stream_options`    | `dict`        | Configure streaming outputs, like whether to return token usage when streaming (`{"include_usage": True}`). |
        | `use_responses_api` | `bool | None` | Whether to use the responses API.                                                                           |

        See full list of supported init args and their descriptions below.

    ??? info "Key init args — client params"

        | Param          | Type                                       | Description                                                                         |
        | -------------- | ------------------------------------------ | ----------------------------------------------------------------------------------- |
        | `timeout`      | `float | Tuple[float, float] | Any | None` | Timeout for requests.                                                               |
        | `max_retries`  | `int | None`                               | Max number of retries.                                                              |
        | `api_key`      | `str | None`                               | OpenAI API key. If not passed in will be read from env var `OPENAI_API_KEY`.        |
        | `base_url`     | `str | None`                               | Base URL for API requests. Only specify if using a proxy or service emulator.       |
        | `organization` | `str | None`                               | OpenAI organization ID. If not passed in will be read from env var `OPENAI_ORG_ID`. |

        See full list of supported init args and their descriptions below.

    ??? info "Instantiate"

        Create a model instance with desired params. For example:

        ```python
        from langchain_openai import ChatOpenAI

        model = ChatOpenAI(
            model="...",
            temperature=0,
            max_tokens=None,
            timeout=None,
            max_retries=2,
            # api_key="...",
            # base_url="...",
            # organization="...",
            # other params...
        )
        ```

        See all available params below.

        !!! tip "Preserved params"
            Any param which is not explicitly supported will be passed directly to
            [`openai.OpenAI.chat.completions.create(...)`](https://platform.openai.com/docs/api-reference/chat/create)
            every time the model is invoked. For example:

            ```python
            from langchain_openai import ChatOpenAI
            import openai

            ChatOpenAI(..., frequency_penalty=0.2).invoke(...)

            # Results in underlying API call of:

            openai.OpenAI(..).chat.completions.create(..., frequency_penalty=0.2)

            # Which is also equivalent to:

            ChatOpenAI(...).invoke(..., frequency_penalty=0.2)
            ```
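
The `_stream()` and `_astream()` methods shown in the dependency diagram back the public `stream()`/`astream()` interface, and the `stream_options` entry in the init-args table above controls whether token usage is reported while streaming. A minimal sketch under those assumptions (model name and prompt are illustrative):

```python
from langchain_openai import ChatOpenAI

# `stream_options` follows the init-args table above; the model name and
# prompt are placeholder values.
model = ChatOpenAI(
    model="gpt-4o-mini",
    stream_options={"include_usage": True},
)

aggregate = None
for chunk in model.stream("Write a haiku about chat models"):
    print(chunk.content, end="", flush=True)
    # Summing chunks accumulates content and usage metadata.
    aggregate = chunk if aggregate is None else aggregate + chunk

# With usage included, the aggregated message carries token counts.
print()
print(aggregate.usage_metadata)
```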

Extends

BaseChatOpenAI

Frequently Asked Questions

What is the ChatOpenAI class?
ChatOpenAI is the class that provides the interface to OpenAI chat model APIs in the langchain codebase, defined in libs/partners/openai/langchain_openai/chat_models/base.py.
Where is ChatOpenAI defined?
ChatOpenAI is defined in libs/partners/openai/langchain_openai/chat_models/base.py at line 2262.
What does ChatOpenAI extend?
ChatOpenAI extends BaseChatOpenAI.
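
A small sketch of how these relationships can be checked at runtime, using the import paths documented above:

```python
from langchain_openai import ChatOpenAI
from langchain_openai.chat_models.base import BaseChatOpenAI

# ChatOpenAI is re-exported from the package root; its definition lives in
# langchain_openai/chat_models/base.py alongside its parent class.
assert issubclass(ChatOpenAI, BaseChatOpenAI)
print(ChatOpenAI.__module__)  # -> "langchain_openai.chat_models.base"
```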
