ModelWithAsyncStream Class — langchain Architecture
Architecture documentation for the ModelWithAsyncStream class in test_base.py from the langchain codebase.
Entity Profile
Dependency Diagram
```mermaid
graph TD
    ModelWithAsyncStream["ModelWithAsyncStream"]
    BaseLLM["BaseLLM"]
    test_base_py["test_base.py"]
    generate["_generate()"]
    astream["_astream()"]
    llm_type["_llm_type()"]
    ModelWithAsyncStream -->|extends| BaseLLM
    ModelWithAsyncStream -->|defined in| test_base_py
    ModelWithAsyncStream -->|method| generate
    ModelWithAsyncStream -->|method| astream
    ModelWithAsyncStream -->|method| llm_type
```
Source Code
libs/core/tests/unit_tests/language_models/llms/test_base.py lines 205–230
```python
class ModelWithAsyncStream(BaseLLM):
    def _generate(
        self,
        prompts: list[str],
        stop: list[str] | None = None,
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> LLMResult:
        """Top Level call."""
        raise NotImplementedError

    @override
    async def _astream(
        self,
        prompt: str,
        stop: list[str] | None = None,
        run_manager: AsyncCallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> AsyncIterator[GenerationChunk]:
        """Stream the output of the model."""
        yield GenerationChunk(text="a")
        yield GenerationChunk(text="b")

    @property
    def _llm_type(self) -> str:
        return "fake-chat-model"
```
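The pattern this test class exercises is an LLM whose only implemented path is asynchronous streaming: `_astream` is an async generator yielding `GenerationChunk` objects, while the synchronous `_generate` raises `NotImplementedError`. The sketch below reproduces that pattern with simplified stand-in classes (hypothetical, not the real langchain `BaseLLM` or `GenerationChunk`) so the control flow can be seen in isolation:

```python
import asyncio
from typing import AsyncIterator


class GenerationChunk:
    """Simplified stand-in for langchain's GenerationChunk: holds one text piece."""

    def __init__(self, text: str) -> None:
        self.text = text


class FakeAsyncStreamModel:
    """Stand-in model mirroring ModelWithAsyncStream: async streaming only."""

    async def _astream(self, prompt: str) -> AsyncIterator[GenerationChunk]:
        # Yield fixed chunks, like ModelWithAsyncStream._astream in the test.
        yield GenerationChunk(text="a")
        yield GenerationChunk(text="b")

    async def astream(self, prompt: str) -> AsyncIterator[str]:
        # Public entry point: delegate to _astream and surface the text of
        # each chunk as it arrives.
        async for chunk in self._astream(prompt):
            yield chunk.text


async def main() -> list[str]:
    model = FakeAsyncStreamModel()
    # Collect the streamed tokens; a real caller would process them as they arrive.
    return [token async for token in model.astream("hello")]


print(asyncio.run(main()))  # ['a', 'b']
```

In the real test, `BaseLLM.astream` plays the role of the public entry point here: it detects the overridden `_astream` and consumes it, so the test can assert that the chunks `"a"` and `"b"` arrive in order without any synchronous generation path existing.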
Frequently Asked Questions
What is the ModelWithAsyncStream class?
ModelWithAsyncStream is a test helper class in the langchain codebase, defined in libs/core/tests/unit_tests/language_models/llms/test_base.py. It is a fake LLM that implements only the async streaming path (_astream), while its synchronous _generate raises NotImplementedError.
Where is ModelWithAsyncStream defined?
ModelWithAsyncStream is defined in libs/core/tests/unit_tests/language_models/llms/test_base.py at line 205.
What does ModelWithAsyncStream extend?
ModelWithAsyncStream extends BaseLLM.