BaseChatOpenAI Class — langchain Architecture
Architecture documentation for the BaseChatOpenAI class in base.py from the langchain codebase.
Dependency Diagram
```mermaid
graph TD
    BaseChatOpenAI["BaseChatOpenAI"]
    BaseChatOpenAI -->|extends| BaseChatModel["BaseChatModel"]
    BaseChatOpenAI -->|uses| AIMessageChunk["AIMessageChunk"]
    BaseChatOpenAI -->|uses| AIMessage["AIMessage"]
    BaseChatOpenAI -->|defined in| basePy["base.py"]
    BaseChatOpenAI -->|method| buildExtra["build_extra()"]
    BaseChatOpenAI -->|method| validateTemperature["validate_temperature()"]
    BaseChatOpenAI -->|method| validateEnvironment["validate_environment()"]
    BaseChatOpenAI -->|method| setModelProfile["_set_model_profile()"]
    BaseChatOpenAI -->|method| defaultParams["_default_params()"]
    BaseChatOpenAI -->|method| combineLlmOutputs["_combine_llm_outputs()"]
    BaseChatOpenAI -->|method| convertChunk["_convert_chunk_to_generation_chunk()"]
    BaseChatOpenAI -->|method| ensureSyncClient["_ensure_sync_client_available()"]
    BaseChatOpenAI -->|method| streamResponses["_stream_responses()"]
    BaseChatOpenAI -->|method| astreamResponses["_astream_responses()"]
```
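Several of the methods in the diagram (`_combine_llm_outputs()`, `_convert_chunk_to_generation_chunk()`, `_stream_responses()`) deal with merging streamed output into a single message. A toy sketch of the chunk-concatenation idea follows; `MessageChunk` here is an illustrative stand-in, not the real `AIMessageChunk`:

```python
from dataclasses import dataclass


@dataclass
class MessageChunk:
    """Stand-in for AIMessageChunk: content concatenates, token counts add."""

    content: str = ""
    tokens: int = 0

    def __add__(self, other: "MessageChunk") -> "MessageChunk":
        # Streaming merge: append text, accumulate usage.
        return MessageChunk(self.content + other.content, self.tokens + other.tokens)


# Chunks as they might arrive from a streaming response.
chunks = [MessageChunk("Hel", 1), MessageChunk("lo", 1), MessageChunk("!", 1)]
merged = sum(chunks, MessageChunk())
```

The real `AIMessageChunk` also merges tool-call deltas and response metadata, but the `__add__`-based accumulation pattern is the same.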
Source Code
libs/partners/openai/langchain_openai/chat_models/base.py lines 513–2259
````python
class BaseChatOpenAI(BaseChatModel):
    """Base wrapper around OpenAI large language models for chat."""

    client: Any = Field(default=None, exclude=True)
    async_client: Any = Field(default=None, exclude=True)
    root_client: Any = Field(default=None, exclude=True)
    root_async_client: Any = Field(default=None, exclude=True)
    model_name: str = Field(default="gpt-3.5-turbo", alias="model")
    """Model name to use."""
    temperature: float | None = None
    """What sampling temperature to use."""
    model_kwargs: dict[str, Any] = Field(default_factory=dict)
    """Holds any model parameters valid for `create` call not explicitly specified."""
    openai_api_key: (
        SecretStr | None | Callable[[], str] | Callable[[], Awaitable[str]]
    ) = Field(
        alias="api_key", default_factory=secret_from_env("OPENAI_API_KEY", default=None)
    )
    """API key to use.

    Can be inferred from the `OPENAI_API_KEY` environment variable, or specified as a
    string, or sync or async callable that returns a string.

    ??? example "Specify with environment variable"

        ```bash
        export OPENAI_API_KEY=...
        ```

        ```python
        from langchain_openai import ChatOpenAI

        model = ChatOpenAI(model="gpt-5-nano")
        ```

    ??? example "Specify with a string"

        ```python
        from langchain_openai import ChatOpenAI

        model = ChatOpenAI(model="gpt-5-nano", api_key="...")
        ```

    ??? example "Specify with a sync callable"

        ```python
        from langchain_openai import ChatOpenAI

        def get_api_key() -> str:
            # Custom logic to retrieve API key
            return "..."

        model = ChatOpenAI(model="gpt-5-nano", api_key=get_api_key)
        ```

    ??? example "Specify with an async callable"

        ```python
        from langchain_openai import ChatOpenAI

        async def get_api_key() -> str:
            # Custom async logic to retrieve API key
            return "..."

        model = ChatOpenAI(model="gpt-5-nano", api_key=get_api_key)
        ```
    """
    openai_api_base: str | None = Field(default=None, alias="base_url")
    """Base URL path for API requests, leave blank if not using a proxy or service emulator."""  # noqa: E501
    openai_organization: str | None = Field(default=None, alias="organization")
    """Automatically inferred from env var `OPENAI_ORG_ID` if not provided."""
    # to support explicit proxy for OpenAI
    openai_proxy: str | None = Field(
        default_factory=from_env("OPENAI_PROXY", default=None)
    )
````
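The `model_kwargs` field holds any `create` parameters that are not declared as explicit model fields; on the real class, a `build_extra()` validator folds unrecognized constructor kwargs into it before validation. A minimal sketch of that collection step, assuming the Pydantic "before"-validator pattern (`collect_extra` and `EXPLICIT_FIELDS` are hypothetical names, not the actual langchain API):

```python
# Illustrative subset of explicitly declared fields.
EXPLICIT_FIELDS = {"model", "model_name", "temperature", "api_key", "base_url"}


def collect_extra(values: dict) -> dict:
    """Move unknown keys into values['model_kwargs'], mirroring the
    build_extra()-style pre-validation step."""
    extra = dict(values.get("model_kwargs", {}))
    for key in list(values):
        if key not in EXPLICIT_FIELDS and key != "model_kwargs":
            if key in extra:
                raise ValueError(f"Parameter {key} supplied twice")
            extra[key] = values.pop(key)
    values["model_kwargs"] = extra
    return values


# top_p is not an explicit field in this sketch, so it is routed to model_kwargs.
values = collect_extra({"model": "gpt-4o-mini", "temperature": 0.2, "top_p": 0.9})
```

This is why constructor arguments the class does not declare still reach the OpenAI `create` call: they travel through `model_kwargs` rather than being rejected.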
Frequently Asked Questions
What is the BaseChatOpenAI class?
BaseChatOpenAI is the base class for OpenAI chat model integrations in the langchain codebase, defined in libs/partners/openai/langchain_openai/chat_models/base.py. It wraps the OpenAI chat completions client and is extended by concrete classes such as ChatOpenAI.
Where is BaseChatOpenAI defined?
BaseChatOpenAI is defined in libs/partners/openai/langchain_openai/chat_models/base.py at line 513.
What does BaseChatOpenAI extend?
BaseChatOpenAI extends BaseChatModel. It constructs and returns AIMessage and AIMessageChunk objects, but does not inherit from them.
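As the `openai_api_key` field's type hints, the class accepts the key as a secret/string, a sync callable, or an async callable. The real class delegates secret handling to `SecretStr` and the OpenAI client; `resolve_api_key` below is a hypothetical helper sketching how those accepted shapes can be normalized to a plain string in a synchronous context:

```python
import asyncio
import inspect
from typing import Awaitable, Callable, Optional, Union

ApiKey = Union[str, None, Callable[[], str], Callable[[], Awaitable[str]]]


def resolve_api_key(api_key: ApiKey) -> Optional[str]:
    """Normalize an api_key value: pass strings/None through, call sync
    callables, and run async callables to completion."""
    if api_key is None or isinstance(api_key, str):
        return api_key
    result = api_key()
    if inspect.iscoroutine(result):
        # Async callable: drive the coroutine in a fresh event loop.
        result = asyncio.run(result)
    return result


key = resolve_api_key(lambda: "sk-demo")
```

In async code paths one would `await` the callable's result instead of calling `asyncio.run`; this sketch only shows the shapes involved, not the library's actual resolution logic.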