_stream() — langchain Function Reference
Architecture documentation for the _stream() function in fake_chat_model.py from the langchain codebase.
Dependency Diagram
graph TD
    bcc2cfdb_efbc_bfa5_edf8_2cba8c4c2bcf["_stream()"]
    a45c6745_da49_cdd1_be34_45a357868be5["GenericFakeChatModel"]
    470a7c50_258d_8644_ae1c_c86fd6540442["_generate()"]
    bcc2cfdb_efbc_bfa5_edf8_2cba8c4c2bcf -->|defined in| a45c6745_da49_cdd1_be34_45a357868be5
    bcc2cfdb_efbc_bfa5_edf8_2cba8c4c2bcf -->|calls| 470a7c50_258d_8644_ae1c_c86fd6540442
    style bcc2cfdb_efbc_bfa5_edf8_2cba8c4c2bcf fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
libs/langchain/tests/unit_tests/llms/fake_chat_model.py lines 95–199
def _stream(
    self,
    messages: list[BaseMessage],
    stop: list[str] | None = None,
    run_manager: CallbackManagerForLLMRun | None = None,
    **kwargs: Any,
) -> Iterator[ChatGenerationChunk]:
    """Stream the output of the model."""
    chat_result = self._generate(
        messages,
        stop=stop,
        run_manager=run_manager,
        **kwargs,
    )
    if not isinstance(chat_result, ChatResult):
        msg = (  # type: ignore[unreachable]
            f"Expected generate to return a ChatResult, "
            f"but got {type(chat_result)} instead."
        )
        raise TypeError(msg)
    message = chat_result.generations[0].message
    if not isinstance(message, AIMessage):
        msg = (
            f"Expected invoke to return an AIMessage, "
            f"but got {type(message)} instead."
        )
        raise TypeError(msg)
    content = message.content
    if content:
        # Use a regular expression to split on whitespace with a capture group
        # so that we can preserve the whitespace in the output.
        assert isinstance(content, str)
        content_chunks = cast("list[str]", re.split(r"(\s)", content))
        for idx, token in enumerate(content_chunks):
            chunk = ChatGenerationChunk(
                message=AIMessageChunk(id=message.id, content=token),
            )
            if (
                idx == len(content_chunks) - 1
                and isinstance(chunk.message, AIMessageChunk)
                and not message.additional_kwargs
            ):
                chunk.message.chunk_position = "last"
            if run_manager:
                run_manager.on_llm_new_token(token, chunk=chunk)
            yield chunk
    if message.additional_kwargs:
        for key, value in message.additional_kwargs.items():
            # We should further break down the additional kwargs into chunks
            # Special case for function call
            if key == "function_call":
                for fkey, fvalue in value.items():
                    if isinstance(fvalue, str):
                        # Break function call by `,`
                        fvalue_chunks = cast("list[str]", re.split(r"(,)", fvalue))
                        for fvalue_chunk in fvalue_chunks:
                            chunk = ChatGenerationChunk(
                                message=AIMessageChunk(
                                    id=message.id,
                                    content="",
                                    additional_kwargs={
                                        "function_call": {fkey: fvalue_chunk},
                                    },
                                ),
                            )
                            if run_manager:
                                run_manager.on_llm_new_token(
                                    "",
                                    chunk=chunk,  # No token for function call
                                )
                            yield chunk
                    else:
                        chunk = ChatGenerationChunk(
                            message=AIMessageChunk(
                                id=message.id,
                                # … (listing truncated)
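Both `re.split` calls in the listing rely on the same trick: when the split pattern contains a capture group, the matched separators are included in the result list, so concatenating the chunks reproduces the original string exactly. A minimal standalone demonstration:

```python
import re

# Splitting on whitespace with a capture group keeps each whitespace
# character as its own token, so the streamed chunks concatenate back
# to the original text with no loss.
content = "hello world  again"
content_chunks = re.split(r"(\s)", content)
assert "".join(content_chunks) == content

# The same technique splits function-call arguments on commas while
# keeping each comma as a separate chunk.
fvalue = '{"a": 1, "b": 2}'
fvalue_chunks = re.split(r"(,)", fvalue)
assert "".join(fvalue_chunks) == fvalue
```

Without the capture group (`re.split(r"\s", content)`), the separators would be discarded and the re-assembled stream would lose its spacing.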
Frequently Asked Questions
What does _stream() do?
_stream() simulates token-by-token streaming for the fake chat model in libs/langchain/tests/unit_tests/llms/fake_chat_model.py. It first calls _generate() to produce a complete ChatResult, validates the result, then splits the message content on whitespace (preserving the whitespace) and yields one ChatGenerationChunk per token, marking the final content chunk with chunk_position="last". Any function_call in additional_kwargs is further split on commas and emitted as additional chunks.
Where is _stream() defined?
_stream() is defined in libs/langchain/tests/unit_tests/llms/fake_chat_model.py at line 95.
What does _stream() call?
_stream() calls one function, _generate(), which returns the complete ChatResult whose message content is then re-chunked and yielded incrementally.
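The pattern _stream() implements — produce a full batch result, then re-chunk it for streaming — can be sketched independently of langchain. All names below are illustrative, not taken from the source:

```python
import re
from typing import Callable, Iterator


def stream_from_batch(generate: Callable[[str], str], prompt: str) -> Iterator[dict]:
    """Illustrative sketch of the generate-then-chunk pattern:
    call a batch `generate` function, split its output into
    whitespace-preserving tokens, and yield one chunk per token,
    tagging the final chunk so consumers can detect end-of-stream."""
    content = generate(prompt)
    tokens = re.split(r"(\s)", content)
    for idx, token in enumerate(tokens):
        yield {
            "content": token,
            "chunk_position": "last" if idx == len(tokens) - 1 else None,
        }


# Concatenating the chunks reconstructs the batch output exactly.
chunks = list(stream_from_batch(lambda _: "fake streamed reply", "hi"))
assert "".join(c["content"] for c in chunks) == "fake streamed reply"
assert chunks[-1]["chunk_position"] == "last"
```

This mirrors why _stream() is useful in unit tests: it exercises streaming callbacks deterministically without any real incremental generation.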