
string.py — langchain Source File

Architecture documentation for string.py, a Python file in the langchain codebase: 2 imports, 0 dependents.

Entity Profile

Dependency Diagram

graph LR
  93668d8e_475c_3ca9_cce8_dafea2eccf1c["string.py"]
  91721f45_4909_e489_8c1f_084f8bd87145["typing_extensions"]
  93668d8e_475c_3ca9_cce8_dafea2eccf1c --> 91721f45_4909_e489_8c1f_084f8bd87145
  5d37c56a_542f_e309_4416_d58ad5f08a28["langchain_core.output_parsers.transform"]
  93668d8e_475c_3ca9_cce8_dafea2eccf1c --> 5d37c56a_542f_e309_4416_d58ad5f08a28
  style 93668d8e_475c_3ca9_cce8_dafea2eccf1c fill:#6366f1,stroke:#818cf8,color:#fff


Source Code

"""String output parser."""

from typing_extensions import override

from langchain_core.output_parsers.transform import BaseTransformOutputParser


class StrOutputParser(BaseTransformOutputParser[str]):
    """Extract text content from model outputs as a string.

    Converts model outputs (such as `AIMessage` or `AIMessageChunk` objects) into plain
    text strings. It's the simplest output parser and is useful when you need string
    responses for downstream processing, display, or storage.

    Supports streaming, yielding text chunks as they're generated by the model.

    Example:
        ```python
        from langchain_core.output_parsers import StrOutputParser
        from langchain_openai import ChatOpenAI

        model = ChatOpenAI(model="gpt-4o")
        parser = StrOutputParser()

        # Get string output from a model
        message = model.invoke("Tell me a joke")
        result = parser.invoke(message)
        print(result)  # plain string

        # With streaming - use transform() to process a stream
        stream = model.stream("Tell me a story")
        for chunk in parser.transform(stream):
            print(chunk, end="", flush=True)
        ```
    """

    @classmethod
    def is_lc_serializable(cls) -> bool:
        """`StrOutputParser` is serializable.

        Returns:
            `True`
        """
        return True

    @classmethod
    def get_lc_namespace(cls) -> list[str]:
        """Get the namespace of the LangChain object.

        Returns:
            `["langchain", "schema", "output_parser"]`
        """
        return ["langchain", "schema", "output_parser"]

    @property
    def _type(self) -> str:
        """Return the output parser type for serialization."""
        return "default"

    @override
    def parse(self, text: str) -> str:
        """Returns the input text with no changes."""
        return text
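
In practice, StrOutputParser is rarely invoked on its own; it is usually piped onto a prompt and a chat model so the chain returns plain strings. The sketch below assumes `ChatOpenAI` as the model provider (following the docstring example above) and a hypothetical prompt; it illustrates the composition pattern and is not code from this file.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI  # assumed provider, as in the docstring example

prompt = ChatPromptTemplate.from_template("Tell me a joke about {topic}")
model = ChatOpenAI(model="gpt-4o")

# Piping the parser onto the chain converts AIMessage outputs into plain strings.
chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "parsers"})
print(result)  # str, not AIMessage

# Streaming works the same way: each yielded chunk is already a string.
for chunk in chain.stream({"topic": "parsers"}):
    print(chunk, end="", flush=True)
```

Because `is_lc_serializable()` returns `True`, the parser (and chains containing it) should also round-trip through LangChain's serialization helpers in `langchain_core.load` (`dumps` / `loads`).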

Domain

  • OutputParsing

Subdomains

  • StreamingParsers

Classes

  • StrOutputParser

Dependencies

  • langchain_core.output_parsers.transform
  • typing_extensions
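
The two dependencies above do the heavy lifting: `BaseTransformOutputParser` supplies the sync and async streaming machinery, while `typing_extensions.override` only marks overridden methods, so a subclass needs to implement little more than `parse`. The class below is a hypothetical example (not part of langchain) sketching that pattern.

```python
from typing_extensions import override

from langchain_core.output_parsers.transform import BaseTransformOutputParser


class UppercaseOutputParser(BaseTransformOutputParser[str]):
    """Hypothetical parser: like StrOutputParser, but upper-cases the text."""

    @override
    def parse(self, text: str) -> str:
        # The base class feeds each chunk of model output through parse(),
        # so streaming support comes for free.
        return text.upper()
```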

Frequently Asked Questions

What does string.py do?
string.py is a Python source file in the langchain codebase. It defines the StrOutputParser class, which extracts plain-string content from model outputs, and belongs to the OutputParsing domain, StreamingParsers subdomain.
What does string.py depend on?
string.py imports 2 modules: langchain_core.output_parsers.transform and typing_extensions.
Where is string.py in the architecture?
string.py is located at libs/core/langchain_core/output_parsers/string.py (domain: OutputParsing, subdomain: StreamingParsers, directory: libs/core/langchain_core/output_parsers).
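
That file path maps to the module `langchain_core.output_parsers.string`. The class is also re-exported from the `langchain_core.output_parsers` package, which is the import path used in the docstring example above; the snippet below is a small sanity check illustrating that both paths name the same class, not code from this file.

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.output_parsers.string import StrOutputParser as StrOutputParserDirect

assert StrOutputParser is StrOutputParserDirect
```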
