_stream() — langchain Function Reference
Architecture documentation for the _stream() function in fake_chat_models.py from the langchain codebase.
Entity Profile
Dependency Diagram
graph TD
    576f6581_ce6e_5e34_ec34_a77e605e8814["_stream()"]
    0f829642_e361_67a1_cbca_f82839bbbd31["GenericFakeChatModel"]
    576f6581_ce6e_5e34_ec34_a77e605e8814 -->|defined in| 0f829642_e361_67a1_cbca_f82839bbbd31
    8b0f62ae_050c_e797_302b_3cbbf84976ba["_stream()"]
    8b0f62ae_050c_e797_302b_3cbbf84976ba -->|calls| 576f6581_ce6e_5e34_ec34_a77e605e8814
    6aab4b64_b0dd_9e21_9fb0_f35c85a54be9["_generate()"]
    576f6581_ce6e_5e34_ec34_a77e605e8814 -->|calls| 6aab4b64_b0dd_9e21_9fb0_f35c85a54be9
    576f6581_ce6e_5e34_ec34_a77e605e8814 -->|calls| 8b0f62ae_050c_e797_302b_3cbbf84976ba
    eec0b1c8_4eca_a0d5_55a8_e9909486cbb7["_generate()"]
    576f6581_ce6e_5e34_ec34_a77e605e8814 -->|calls| eec0b1c8_4eca_a0d5_55a8_e9909486cbb7
    style 576f6581_ce6e_5e34_ec34_a77e605e8814 fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
libs/core/langchain_core/language_models/fake_chat_models.py lines 266–367
def _stream(
    self,
    messages: list[BaseMessage],
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    **kwargs: Any,
) -> Iterator[ChatGenerationChunk]:
    chat_result = self._generate(
        messages, stop=stop, run_manager=run_manager, **kwargs
    )
    if not isinstance(chat_result, ChatResult):
        msg = (
            f"Expected generate to return a ChatResult, "
            f"but got {type(chat_result)} instead."
        )
        raise ValueError(msg)  # noqa: TRY004
    message = chat_result.generations[0].message
    if not isinstance(message, AIMessage):
        msg = (
            f"Expected invoke to return an AIMessage, "
            f"but got {type(message)} instead."
        )
        raise ValueError(msg)  # noqa: TRY004
    content = message.content
    if content:
        # Use a regular expression to split on whitespace with a capture group
        # so that we can preserve the whitespace in the output.
        if not isinstance(content, str):
            msg = "Expected content to be a string."
            raise ValueError(msg)
        content_chunks = cast("list[str]", re.split(r"(\s)", content))
        for idx, token in enumerate(content_chunks):
            chunk = ChatGenerationChunk(
                message=AIMessageChunk(content=token, id=message.id)
            )
            if (
                idx == len(content_chunks) - 1
                and isinstance(chunk.message, AIMessageChunk)
                and not message.additional_kwargs
            ):
                chunk.message.chunk_position = "last"
            if run_manager:
                run_manager.on_llm_new_token(token, chunk=chunk)
            yield chunk
    if message.additional_kwargs:
        for key, value in message.additional_kwargs.items():
            # We should further break down the additional kwargs into chunks
            # Special case for function call
            if key == "function_call":
                for fkey, fvalue in value.items():
                    if isinstance(fvalue, str):
                        # Break function call by `,`
                        fvalue_chunks = cast("list[str]", re.split(r"(,)", fvalue))
                        for fvalue_chunk in fvalue_chunks:
                            chunk = ChatGenerationChunk(
                                message=AIMessageChunk(
                                    id=message.id,
                                    content="",
                                    additional_kwargs={
                                        "function_call": {fkey: fvalue_chunk}
                                    },
                                )
                            )
                            if run_manager:
                                run_manager.on_llm_new_token(
                                    "",
                                    chunk=chunk,  # No token for function call
                                )
                            yield chunk
                    else:
                        chunk = ChatGenerationChunk(
                            message=AIMessageChunk(
                                id=message.id,
                                content="",
                                additional_kwargs={"function_call": {fkey: fvalue}},
                            )
                        )
                        if run_manager:
                            run_manager.on_llm_new_token("", chunk=chunk)
                        yield chunk
        # … (excerpt truncated; lines 266–367 of the source continue with the
        # branch that streams the remaining additional_kwargs entries)
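The function_call branch above relies on re.split with a capturing group so the delimiters survive the split and the chunks can be concatenated back into the original string. A minimal stdlib-only sketch of that behavior (the sample arguments string below is invented for illustration, not taken from langchain):

```python
import re

# A JSON-ish arguments string such as a function_call might carry
# (hypothetical sample value).
fvalue = '{"location": "Paris", "unit": "celsius"}'

# Splitting on `,` with a capture group keeps each comma as its own
# list item, so joining the pieces reproduces the original string.
fvalue_chunks = re.split(r"(,)", fvalue)

print(fvalue_chunks)
print("".join(fvalue_chunks) == fvalue)
```

Because nothing is dropped by the split, a consumer that merges the streamed chunks recovers the exact arguments string that _generate() produced.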
Frequently Asked Questions
What does _stream() do?
_stream() is the streaming implementation of GenericFakeChatModel, defined in libs/core/langchain_core/language_models/fake_chat_models.py. It first calls self._generate() to obtain the complete AIMessage, then re-emits that message as ChatGenerationChunks: the text content is split on whitespace (with separators preserved), and any function_call entry in additional_kwargs is split on commas.
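The core chunking step can be seen in isolation with only the standard library; this sketch mirrors the re.split(r"(\s)", content) call in _stream():

```python
import re

content = "Hello brave new world"

# The capture group keeps each whitespace separator as its own token,
# so words and separators alternate and streaming the tokens
# reproduces the text exactly.
tokens = re.split(r"(\s)", content)

print(tokens)
print("".join(tokens) == content)
```

Note that runs of consecutive whitespace produce empty-string tokens between the separators, which is harmless here: concatenating all tokens still yields the original content byte-for-byte.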
Where is _stream() defined?
_stream() is defined in libs/core/langchain_core/language_models/fake_chat_models.py at line 266.
What does _stream() call?
_stream() calls 2 functions: _generate(), which produces the complete ChatResult that is then split into chunks, and _stream().
What calls _stream()?
_stream() is called by 1 function: _stream().
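The generate-then-stream relationship described above can be sketched without langchain at all. The stand-in class and names below are invented for illustration (they are not langchain API), but the shape matches GenericFakeChatModel._stream(): produce the full reply first, then slice it into whitespace-preserving tokens.

```python
import re
from typing import Iterator


class TinyFakeModel:
    """Hypothetical stand-in for GenericFakeChatModel (not langchain API)."""

    def __init__(self, responses: list[str]) -> None:
        self._responses = iter(responses)

    def _generate(self) -> str:
        # Produce the complete response up front, as _stream() does by
        # delegating to self._generate().
        return next(self._responses)

    def _stream(self) -> Iterator[str]:
        # Then re-split the finished text into tokens, preserving whitespace.
        yield from re.split(r"(\s)", self._generate())


model = TinyFakeModel(["hello streaming world"])
print(list(model._stream()))
```

This is why the fake model is useful in tests: the streamed output is fully determined by the pre-baked response, yet it exercises the same chunk-by-chunk callback path as a real streaming model.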