CompletionCreateParamsBase Class — anthropic-sdk-python Architecture
Architecture documentation for the CompletionCreateParamsBase class in completion_create_params.py from the anthropic-sdk-python codebase.
Entity Profile
Dependency Diagram
graph TD
    510f0967_0fa3_48f5_ebde_cd8fbe93e951["CompletionCreateParamsBase"]
    71316ac3_9f59_4b57_be15_e422c1f1309b["completion_create_params.py"]
    510f0967_0fa3_48f5_ebde_cd8fbe93e951 -->|defined in| 71316ac3_9f59_4b57_be15_e422c1f1309b
Source Code
src/anthropic/types/completion_create_params.py lines 26–101
class CompletionCreateParamsBase(TypedDict, total=False):
    max_tokens_to_sample: Required[int]
    """The maximum number of tokens to generate before stopping.

    Note that our models may stop _before_ reaching this maximum. This parameter
    only specifies the absolute maximum number of tokens to generate.
    """

    model: Required[ModelParam]
    """
    The model that will complete your prompt.\n\nSee
    [models](https://docs.anthropic.com/en/docs/models-overview) for additional
    details and options.
    """

    prompt: Required[str]
    """The prompt that you want Claude to complete.

    For proper response generation you will need to format your prompt using
    alternating `\n\nHuman:` and `\n\nAssistant:` conversational turns. For example:

    ```
    "\n\nHuman: {userQuestion}\n\nAssistant:"
    ```

    See [prompt validation](https://docs.claude.com/en/api/prompt-validation) and
    our guide to [prompt design](https://docs.claude.com/en/docs/intro-to-prompting)
    for more details.
    """

    metadata: MetadataParam
    """An object describing metadata about the request."""

    stop_sequences: SequenceNotStr[str]
    """Sequences that will cause the model to stop generating.

    Our models stop on `"\n\nHuman:"`, and may include additional built-in stop
    sequences in the future. By providing the stop_sequences parameter, you may
    include additional strings that will cause the model to stop generating.
    """

    temperature: float
    """Amount of randomness injected into the response.

    Defaults to `1.0`. Ranges from `0.0` to `1.0`. Use `temperature` closer to `0.0`
    for analytical / multiple choice, and closer to `1.0` for creative and
    generative tasks.

    Note that even with `temperature` of `0.0`, the results will not be fully
    deterministic.
    """

    top_k: int
    """Only sample from the top K options for each subsequent token.

    Used to remove "long tail" low probability responses.
    [Learn more technical details here](https://towardsdatascience.com/how-to-sample-from-language-models-682bceb97277).

    Recommended for advanced use cases only. You usually only need to use
    `temperature`.
    """

    top_p: float
    """Use nucleus sampling.

    In nucleus sampling, we compute the cumulative distribution over all the options
    for each subsequent token in decreasing probability order and cut it off once it
    reaches a particular probability specified by `top_p`. You should either alter
    `temperature` or `top_p`, but not both.

    Recommended for advanced use cases only. You usually only need to use
    `temperature`.
    """

    betas: Annotated[List[AnthropicBetaParam], PropertyInfo(alias="anthropic-beta")]
    """Optional header to specify the beta version(s) you want to use."""
Frequently Asked Questions
What is the CompletionCreateParamsBase class?
CompletionCreateParamsBase is a TypedDict in the anthropic-sdk-python codebase that defines the parameters for creating a text completion: the required model, prompt, and max_tokens_to_sample fields, plus optional sampling controls (temperature, top_k, top_p), stop_sequences, metadata, and betas. It is defined in src/anthropic/types/completion_create_params.py.
Where is CompletionCreateParamsBase defined?
CompletionCreateParamsBase is defined in src/anthropic/types/completion_create_params.py at line 26.
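Because CompletionCreateParamsBase is a TypedDict rather than a runtime model, it can be used to type-check a parameter dictionary before passing it to the client. A small sketch, assuming the module path above is imported directly; the model name and prompt are illustrative:

```python
import anthropic
from anthropic.types.completion_create_params import CompletionCreateParamsBase

# Type checkers (mypy, pyright) flag missing Required keys or wrong value types here.
params: CompletionCreateParamsBase = {
    "model": "claude-2.1",  # illustrative model name
    "max_tokens_to_sample": 128,
    "prompt": "\n\nHuman: Hello\n\nAssistant:",
    "temperature": 0.0,
}

client = anthropic.Anthropic()
completion = client.completions.create(**params)
print(completion.completion)
```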