
BetaMessage Class — anthropic-sdk-python Architecture

Architecture documentation for the BetaMessage class in beta_message.py from the anthropic-sdk-python codebase.

Entity Profile

Dependency Diagram

graph TD
  BetaMessage["BetaMessage"]
  BaseModel["BaseModel"]
  BetaMessage -->|extends| BaseModel
  beta_message_py["beta_message.py"]
  BetaMessage -->|defined in| beta_message_py
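
Because BetaMessage extends the SDK's pydantic-based BaseModel, a response from the beta Messages API can be inspected and serialized with the usual pydantic helpers. The sketch below is illustrative only: it assumes an `Anthropic` client configured via the `ANTHROPIC_API_KEY` environment variable, and the model name is a placeholder rather than anything taken from this file.

```python
# Minimal sketch: the API response is a BetaMessage, i.e. a pydantic model.
# The model name below is a placeholder assumption.
from anthropic import Anthropic
from anthropic.types.beta import BetaMessage

client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

message = client.beta.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=64,
    messages=[{"role": "user", "content": "Say hello."}],
)

assert isinstance(message, BetaMessage)   # extends the SDK's BaseModel
print(message.model_dump_json(indent=2))  # standard pydantic serialization
print(message.to_dict()["role"])          # SDK helper; always "assistant"
```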


Source Code

src/anthropic/types/beta/beta_message.py lines 17–131

class BetaMessage(BaseModel):
    id: str
    """Unique object identifier.

    The format and length of IDs may change over time.
    """

    container: Optional[BetaContainer] = None
    """
    Information about the container used in the request (for the code execution
    tool)
    """

    content: List[BetaContentBlock]
    """Content generated by the model.

    This is an array of content blocks, each of which has a `type` that determines
    its shape.

    Example:

    ```json
    [{ "type": "text", "text": "Hi, I'm Claude." }]
    ```

    If the request input `messages` ended with an `assistant` turn, then the
    response `content` will continue directly from that last turn. You can use this
    to constrain the model's output.

    For example, if the input `messages` were:

    ```json
    [
      {
        "role": "user",
        "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun"
      },
      { "role": "assistant", "content": "The best answer is (" }
    ]
    ```

    Then the response `content` might be:

    ```json
    [{ "type": "text", "text": "B)" }]
    ```
    """

    context_management: Optional[BetaContextManagementResponse] = None
    """Context management response.

    Information about context management strategies applied during the request.
    """

    model: Model
    """
    The model that will complete your prompt.\n\nSee
    [models](https://docs.anthropic.com/en/docs/models-overview) for additional
    details and options.
    """

    role: Literal["assistant"]
    """Conversational role of the generated message.

    This will always be `"assistant"`.
    """

    stop_reason: Optional[BetaStopReason] = None
    """The reason that we stopped.

    This may be one of the following values:

    - `"end_turn"`: the model reached a natural stopping point
    - `"max_tokens"`: we exceeded the requested `max_tokens` or the model's maximum
    - `"stop_sequence"`: one of your provided custom `stop_sequences` was generated
    - `"tool_use"`: the model invoked one or more tools
    - `"pause_turn"`: we paused a long-running turn. You may provide the response
      back as-is in a subsequent request to let the model continue.
    - `"refusal"`: when streaming classifiers intervene to handle potential policy
      violations
    """

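The `content` docstring above describes how ending the input `messages` with a partial `assistant` turn constrains the continuation, and `stop_reason` reports why generation stopped. Below is a hedged sketch of that pattern; the client setup and model name are assumptions for illustration, not part of the excerpt.

```python
# Sketch: constraining output by ending the input with an assistant turn,
# then inspecting the resulting BetaMessage fields. Model name is a placeholder.
from anthropic import Anthropic

client = Anthropic()

message = client.beta.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=8,
    messages=[
        {
            "role": "user",
            "content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun",
        },
        # The response `content` continues directly from this partial assistant turn.
        {"role": "assistant", "content": "The best answer is ("},
    ],
)

print(message.role)             # always "assistant"
print(message.content[0].text)  # e.g. 'B)'
print(message.stop_reason)      # e.g. "end_turn" or "max_tokens"
```

Per the docstring, if `stop_reason` is `"pause_turn"`, the returned content can be passed back as-is as an assistant turn in a subsequent request to let the model continue.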
Extends

BaseModel

Frequently Asked Questions

What is the BetaMessage class?
BetaMessage is a pydantic response model (extending the SDK's BaseModel) representing a message generated by the model via the beta Messages API. It is defined in src/anthropic/types/beta/beta_message.py.
Where is BetaMessage defined?
BetaMessage is defined in src/anthropic/types/beta/beta_message.py at line 17.
What does BetaMessage extend?
BetaMessage extends BaseModel.
