
Implementation: anthropics/anthropic-sdk-python Cloud Provider Clients

From Leeroopedia
Knowledge Sources
Domains Cloud_Deployment, LLM, Infrastructure
Last Updated 2026-02-15 00:00 GMT

Overview

This page documents the concrete provider client classes used to interact with Claude models through AWS Bedrock, Google Cloud Vertex AI, and Azure AI Foundry. Each class adapts the base Anthropic client to a specific cloud provider while maintaining the same Messages API surface.

Source:

  • Bedrock: src/anthropic/lib/bedrock/_client.py:L131-188
  • Vertex: src/anthropic/lib/vertex/_client.py:L90-148
  • Foundry: src/anthropic/lib/foundry.py:L90-174

Imports

# Bedrock
from anthropic import AnthropicBedrock
from anthropic import AsyncAnthropicBedrock

# Vertex AI
from anthropic import AnthropicVertex
from anthropic import AsyncAnthropicVertex

# Azure AI Foundry
from anthropic import AnthropicFoundry
from anthropic import AsyncAnthropicFoundry

AnthropicBedrock

Constructor Signature

class AnthropicBedrock(BaseBedrockClient[httpx.Client, Stream[Any]], SyncAPIClient):

    def __init__(
        self,
        aws_secret_key: str | None = None,
        aws_access_key: str | None = None,
        aws_region: str | None = None,
        aws_profile: str | None = None,
        aws_session_token: str | None = None,
        base_url: str | httpx.URL | None = None,
        timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
        max_retries: int = DEFAULT_MAX_RETRIES,
        default_headers: Mapping[str, str] | None = None,
        default_query: Mapping[str, object] | None = None,
        http_client: httpx.Client | None = None,
        _strict_response_validation: bool = False,
    ) -> None: ...

Usage Example

from anthropic import AnthropicBedrock

client = AnthropicBedrock(
    aws_access_key="AKIAIOSFODNN7EXAMPLE",
    aws_secret_key="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
    aws_region="us-east-1",
)

message = client.messages.create(
    model="us.anthropic.claude-sonnet-4-20250514-v1:0",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello from Bedrock!"}],
)
print(message.content[0].text)

Exposed Resources

  • client.messages -- Messages instance
  • client.completions -- Completions instance (legacy)
  • client.beta -- Beta instance

Default Base URL

https://bedrock-runtime.{aws_region}.amazonaws.com
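The endpoint is produced by substituting aws_region into this template. A minimal sketch of that substitution (illustrative only; the SDK resolves the URL internally):

```python
def bedrock_default_base_url(aws_region: str) -> str:
    # Substitute the region into the documented endpoint template.
    return f"https://bedrock-runtime.{aws_region}.amazonaws.com"

print(bedrock_default_base_url("us-east-1"))
# https://bedrock-runtime.us-east-1.amazonaws.com
```

Passing an explicit base_url to the constructor bypasses this template entirely.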

Streaming Decoder

Returns AWSEventStreamDecoder via _make_sse_decoder(), which parses AWS EventStream binary protocol using botocore.eventstream.EventStreamBuffer.
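The binary framing itself is delegated to botocore; the decoder is essentially a feed-and-drain loop over a buffer. The toy sketch below shows only that loop shape, using newline framing in place of the real AWS EventStream binary format (ToyEventBuffer is a hypothetical stand-in for EventStreamBuffer, not the SDK's code):

```python
class ToyEventBuffer:
    """Hypothetical stand-in for botocore.eventstream.EventStreamBuffer:
    here a newline-terminated chunk counts as one complete 'event'."""

    def __init__(self) -> None:
        self._buf = b""
        self._events: list[bytes] = []

    def add_data(self, data: bytes) -> None:
        # Accumulate raw bytes and slice off any complete events.
        self._buf += data
        while b"\n" in self._buf:
            event, self._buf = self._buf.split(b"\n", 1)
            self._events.append(event)

    def __iter__(self):
        # Drain events that are complete so far.
        while self._events:
            yield self._events.pop(0)


def decode(chunks):
    """Feed received chunks into the buffer, yielding completed events."""
    buf = ToyEventBuffer()
    for chunk in chunks:
        buf.add_data(chunk)
        for event in buf:
            yield event.decode()


print(list(decode([b"hel", b"lo\nwor", b"ld\n"])))
# ['hello', 'world']
```

The real decoder performs the same feed-and-drain cycle, but each drained event carries a parsed EventStream message rather than a line of text.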

Dependencies

boto3 >= 1.28.57, botocore >= 1.31.57

AnthropicVertex

Constructor Signature

class AnthropicVertex(BaseVertexClient[httpx.Client, Stream[Any]], SyncAPIClient):

    def __init__(
        self,
        *,
        region: str | NotGiven = NOT_GIVEN,
        project_id: str | NotGiven = NOT_GIVEN,
        access_token: str | None = None,
        credentials: GoogleCredentials | None = None,
        base_url: str | httpx.URL | None = None,
        timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
        max_retries: int = DEFAULT_MAX_RETRIES,
        default_headers: Mapping[str, str] | None = None,
        default_query: Mapping[str, object] | None = None,
        http_client: httpx.Client | None = None,
        _strict_response_validation: bool = False,
    ) -> None: ...

Usage Example

from anthropic import AnthropicVertex

client = AnthropicVertex(
    region="us-east5",
    project_id="my-gcp-project",
)

message = client.messages.create(
    model="claude-sonnet-4@20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello from Vertex!"}],
)
print(message.content[0].text)

Exposed Resources

  • client.messages -- Messages instance
  • client.beta -- Beta instance

Default Base URL

https://{region}-aiplatform.googleapis.com/v1

Dependencies

google-auth[requests] >= 2, < 3

AnthropicFoundry

Constructor Signature

class AnthropicFoundry(BaseFoundryClient[httpx.Client, Stream[Any]], Anthropic):

    def __init__(
        self,
        *,
        resource: str | None = None,
        api_key: str | None = None,
        azure_ad_token_provider: AzureADTokenProvider | None = None,
        base_url: str | None = None,
        timeout: float | Timeout | None | NotGiven = NOT_GIVEN,
        max_retries: int = DEFAULT_MAX_RETRIES,
        default_headers: Mapping[str, str] | None = None,
        default_query: Mapping[str, object] | None = None,
        http_client: httpx.Client | None = None,
        _strict_response_validation: bool = False,
    ) -> None: ...

Usage Example

from anthropic import AnthropicFoundry

# With API key
client = AnthropicFoundry(
    resource="my-resource",
    api_key="my-foundry-api-key",
)

# With Azure AD token provider
from azure.identity import DefaultAzureCredential
credential = DefaultAzureCredential()

client = AnthropicFoundry(
    resource="my-resource",
    azure_ad_token_provider=lambda: credential.get_token(
        "https://cognitiveservices.azure.com/.default"
    ).token,
)

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello from Azure!"}],
)
print(message.content[0].text)

Exposed Resources

  • client.messages -- MessagesFoundry instance (batches disabled)
  • client.beta -- BetaFoundry instance (batch messages disabled)
  • client.models -- Returns None (not supported)

Default Base URL

https://{resource}.services.ai.azure.com/anthropic/

Dependencies

No extra dependencies required (uses core httpx).
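The azure_ad_token_provider parameter is a zero-argument callable returning a bearer token string, so any refresh logic can be wrapped inside it. A toy sketch of a caching provider (the expiry handling below is illustrative, not part of the SDK):

```python
import time


class CachingTokenProvider:
    """Zero-arg callable that caches a token until shortly before expiry."""

    def __init__(self, fetch, skew: float = 60.0):
        self._fetch = fetch          # callable returning (token, expires_at_epoch)
        self._skew = skew            # refresh this many seconds before expiry
        self._token = None
        self._expires_at = 0.0

    def __call__(self) -> str:
        if self._token is None or time.time() >= self._expires_at - self._skew:
            self._token, self._expires_at = self._fetch()
        return self._token


provider = CachingTokenProvider(lambda: ("tok-1", time.time() + 3600))
print(provider(), provider())  # fetched once, then served from cache
```

An instance of such a class can be passed directly as azure_ad_token_provider, since the client only requires that the provider be callable with no arguments.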

Async Variants

Each provider has an async counterpart with identical constructor signatures but using httpx.AsyncClient:

Sync Class Async Class Module
AnthropicBedrock AsyncAnthropicBedrock anthropic.lib.bedrock._client
AnthropicVertex AsyncAnthropicVertex anthropic.lib.vertex._client
AnthropicFoundry AsyncAnthropicFoundry anthropic.lib.foundry

Example:

import asyncio
from anthropic import AsyncAnthropicBedrock

async def main():
    client = AsyncAnthropicBedrock(aws_region="us-west-2")
    message = await client.messages.create(
        model="us.anthropic.claude-sonnet-4-20250514-v1:0",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello async!"}],
    )
    print(message.content[0].text)

asyncio.run(main())
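The async clients pair naturally with asyncio.gather for issuing several requests concurrently. The sketch below shows only the fan-out pattern; fake_create is a stand-in for an awaited client.messages.create call so the example runs without credentials:

```python
import asyncio


async def fake_create(prompt: str) -> str:
    # Stand-in for `await client.messages.create(...)`.
    await asyncio.sleep(0)  # simulate network latency
    return f"echo: {prompt}"


async def run_all(prompts):
    # Fan out all requests at once; gather preserves input order.
    return await asyncio.gather(*(fake_create(p) for p in prompts))


results = asyncio.run(run_all(["a", "b"]))
print(results)
# ['echo: a', 'echo: b']
```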

Common Parameters (All Providers)

Parameter Type Default Description
base_url str | httpx.URL | None Provider-specific Override the default endpoint URL
timeout float | httpx.Timeout | None | NotGiven NOT_GIVEN Request timeout in seconds
max_retries int 2 (DEFAULT_MAX_RETRIES) Number of automatic retries on transient errors
default_headers Mapping[str, str] | None None Headers merged into every request
default_query Mapping[str, object] | None None Query params merged into every request
http_client httpx.Client | None None Custom httpx client for transport configuration

copy() / with_options()

All provider clients support creating derived instances via copy() (aliased as with_options()):

# Create a client with a longer timeout for a specific call
long_timeout_client = client.with_options(timeout=120.0)
message = long_timeout_client.messages.create(...)
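The semantics are copy-on-write: the derived client receives the overrides while the original keeps its settings. A toy illustration of that behavior (ToyClient is hypothetical; the real clients accept the common parameters listed above):

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class ToyClient:
    timeout: float = 600.0
    max_retries: int = 2

    def with_options(self, **overrides):
        # Return a new instance with the overrides applied;
        # the original instance is left untouched.
        return replace(self, **overrides)


base = ToyClient()
slow = base.with_options(timeout=120.0)
print(base.timeout, slow.timeout)
# 600.0 120.0
```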
