_astream() — langchain Function Reference
Architecture documentation for the _astream() function in chat_models.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    8028f890_4975_51cc_2409_45db9b39971a["_astream()"]
    977b57b2_5d0e_bcf4_a43e_b52857105005["ChatAnthropic"]
    8028f890_4975_51cc_2409_45db9b39971a -->|defined in| 977b57b2_5d0e_bcf4_a43e_b52857105005
    954f0cfe_731b_ac1b_0145_b5a3d210030b["_get_request_payload()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| 954f0cfe_731b_ac1b_0145_b5a3d210030b
    e04429ee_bd93_2a97_005f_7ce3ae9f3e4f["_acreate()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| e04429ee_bd93_2a97_005f_7ce3ae9f3e4f
    8f4b2740_7c71_b933_6989_ad2d7f8a4b42["_tools_in_params()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| 8f4b2740_7c71_b933_6989_ad2d7f8a4b42
    56815eb0_82df_1181_dbf9_0d0e97e40514["_documents_in_params()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| 56815eb0_82df_1181_dbf9_0d0e97e40514
    d0043912_79ce_fb22_d24e_6234ef57df26["_thinking_in_params()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| d0043912_79ce_fb22_d24e_6234ef57df26
    aea38cf0_2da9_3eec_9bfd_04f393036557["_compact_in_params()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| aea38cf0_2da9_3eec_9bfd_04f393036557
    01106d9b_3ac9_a2a5_056c_a55bc89d961b["_make_message_chunk_from_anthropic_event()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| 01106d9b_3ac9_a2a5_056c_a55bc89d961b
    b9fd12bf_ff30_9c6b_c6e6_df17524f5c55["_handle_anthropic_bad_request()"]
    8028f890_4975_51cc_2409_45db9b39971a -->|calls| b9fd12bf_ff30_9c6b_c6e6_df17524f5c55
    style 8028f890_4975_51cc_2409_45db9b39971a fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
libs/partners/anthropic/langchain_anthropic/chat_models.py lines 1294–1329
async def _astream(
    self,
    messages: list[BaseMessage],
    stop: list[str] | None = None,
    run_manager: AsyncCallbackManagerForLLMRun | None = None,
    *,
    stream_usage: bool | None = None,
    **kwargs: Any,
) -> AsyncIterator[ChatGenerationChunk]:
    if stream_usage is None:
        stream_usage = self.stream_usage
    kwargs["stream"] = True
    payload = self._get_request_payload(messages, stop=stop, **kwargs)
    try:
        stream = await self._acreate(payload)
        coerce_content_to_string = (
            not _tools_in_params(payload)
            and not _documents_in_params(payload)
            and not _thinking_in_params(payload)
            and not _compact_in_params(payload)
        )
        block_start_event = None
        async for event in stream:
            msg, block_start_event = _make_message_chunk_from_anthropic_event(
                event,
                stream_usage=stream_usage,
                coerce_content_to_string=coerce_content_to_string,
                block_start_event=block_start_event,
            )
            if msg is not None:
                chunk = ChatGenerationChunk(message=msg)
                if run_manager and isinstance(msg.content, str):
                    await run_manager.on_llm_new_token(msg.content, chunk=chunk)
                yield chunk
    except anthropic.BadRequestError as e:
        _handle_anthropic_bad_request(e)
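The `coerce_content_to_string` flag is the gate for collapsing streamed content into plain strings: it is true only when none of the structured features (tools, documents, extended thinking, compaction) appear in the payload. The sketch below illustrates that decision; the payload keys checked here (`"tools"`, `"documents"`, `"thinking"`, `"compact"`) are assumptions standing in for the real `_tools_in_params()` / `_documents_in_params()` / `_thinking_in_params()` / `_compact_in_params()` predicates, which inspect the payload in more detail.

```python
# Hypothetical sketch of the coerce_content_to_string decision in _astream().
# The top-level key checks are an assumption; the real predicates inspect the
# request payload (and message content) more thoroughly.

def should_coerce_content_to_string(payload: dict) -> bool:
    """Content stays a plain string only when no structured feature is in use."""
    structured_keys = ("tools", "documents", "thinking", "compact")
    return not any(payload.get(key) for key in structured_keys)


plain = should_coerce_content_to_string({"messages": [], "model": "claude"})
with_tools = should_coerce_content_to_string({"tools": [{"name": "search"}]})
```

With a plain chat payload the function returns `True` and content can be coerced to a string; as soon as tools (or any other structured feature) are present it returns `False`, so content blocks are kept in their structured form.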
Frequently Asked Questions
What does _astream() do?
_astream() is the asynchronous streaming implementation for ChatAnthropic in the langchain codebase, defined in libs/partners/anthropic/langchain_anthropic/chat_models.py. It builds the request payload, opens a streaming response via _acreate(), converts each Anthropic stream event into a message chunk, and yields ChatGenerationChunk objects, notifying the run manager of each new string token as it arrives.
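From the caller's side, _astream() behaves like any async iterator of chunks whose message content arrives incrementally. The following self-contained sketch mimics that consumption pattern; `FakeChunk` and `fake_astream` are stand-ins invented for illustration, not the real `ChatGenerationChunk` or the method itself.

```python
import asyncio
from collections.abc import AsyncIterator


class FakeChunk:
    """Stand-in for ChatGenerationChunk: carries one piece of streamed text."""

    def __init__(self, text: str) -> None:
        self.text = text


async def fake_astream() -> AsyncIterator[FakeChunk]:
    # Simulates a streaming response arriving in small pieces.
    for piece in ("Hel", "lo, ", "world"):
        yield FakeChunk(piece)


async def collect() -> str:
    # The same `async for` loop a caller would use over _astream()'s chunks.
    parts: list[str] = []
    async for chunk in fake_astream():
        parts.append(chunk.text)
    return "".join(parts)


result = asyncio.run(collect())  # → "Hello, world"
```

The real method additionally forwards each string token to the callback manager via `on_llm_new_token` before yielding, which is how streaming callbacks fire in LangChain.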
Where is _astream() defined?
_astream() is defined in libs/partners/anthropic/langchain_anthropic/chat_models.py at line 1294.
What does _astream() call?
_astream() calls 8 function(s): _acreate, _compact_in_params, _documents_in_params, _get_request_payload, _handle_anthropic_bad_request, _make_message_chunk_from_anthropic_event, _thinking_in_params, _tools_in_params.