
    Ruby SDK

    Install and configure the Anthropic Ruby SDK with Sorbet types, streaming helpers, and connection pooling

The Anthropic Ruby library provides convenient access to the Anthropic REST API from any Ruby 3.2.0+ application. It ships with comprehensive types and docstrings in YARD, RBS, and RBI formats. The standard library's net/http is used as the HTTP transport, with connection pooling via the connection_pool gem.

    For API feature documentation with code examples, see the API reference. This page covers Ruby-specific SDK features and configuration.

    Installation

    To use this gem, install via Bundler by adding the following to your application's Gemfile:

    gem "anthropic", "~> 1.16.3"

    Requirements

    Ruby 3.2.0 or higher.

    Usage

    require "anthropic"
    
    anthropic = Anthropic::Client.new(
      api_key: ENV["ANTHROPIC_API_KEY"] # This is the default and can be omitted
    )
    
    message = anthropic.messages.create(
      max_tokens: 1024,
      messages: [{role: "user", content: "Hello, Claude"}],
      model: :"claude-opus-4-6"
    )
    
    puts(message.content)

    Streaming

    We provide support for streaming responses using Server-Sent Events (SSE).

    stream = anthropic.messages.stream(
      max_tokens: 1024,
      messages: [{role: "user", content: "Hello, Claude"}],
      model: :"claude-opus-4-6"
    )
    
    stream.each do |message|
      puts(message.type)
    end

    Streaming helpers

    This library provides several conveniences for streaming messages, for example:

    stream = anthropic.messages.stream(
      max_tokens: 1024,
      messages: [{role: :user, content: "Say hello there!"}],
      model: :"claude-opus-4-6"
    )
    
    stream.text.each do |text|
      print(text)
    end

Streaming with anthropic.messages.stream(...) exposes various helpers, including accumulation and SDK-specific events.
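As a plain-Ruby sketch of what accumulation amounts to, with an ordinary Enumerator standing in for the stream.text helper above (no API call and no SDK-specific event objects are involved):

```ruby
# An Enumerator of text chunks stands in for `stream.text`.
chunks = ["Hello", ", ", "Claude"].each

buffer = +""  # unfrozen String to accumulate into
chunks.each { |chunk| buffer << chunk }

puts buffer  # Hello, Claude
```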

    Input Schema and Tool Calling

    The SDK provides helper mechanisms to define structured data classes for tools and let Claude automatically execute them. For detailed documentation on tool use patterns including the tool runner, see Implementing Tool Use.

    class CalculatorInput < Anthropic::BaseModel
      required :lhs, Float
      required :rhs, Float
      required :operator, Anthropic::InputSchema::EnumOf[:+, :-, :*, :/]
    end
    
    class Calculator < Anthropic::BaseTool
      input_schema CalculatorInput
    
      def call(expr)
        expr.lhs.public_send(expr.operator, expr.rhs)
      end
    end
    
    # Automatically handles tool execution loop
    client.beta.messages.tool_runner(
      model: "claude-opus-4-6",
      max_tokens: 1024,
      messages: [{role: "user", content: "What's 15 * 7?"}],
      tools: [Calculator.new]
    ).each_message { puts _1.content }
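The Calculator#call body works because Ruby's arithmetic operators are ordinary methods, so public_send can dispatch on the operator symbol. A standalone sketch of just that dispatch, with a plain Struct standing in for the validated CalculatorInput (no SDK types involved):

```ruby
# Plain stand-in for the validated tool input.
Expr = Struct.new(:lhs, :rhs, :operator)

def calculate(expr)
  # e.g. 15.0.public_send(:*, 7.0) is exactly 15.0 * 7.0
  expr.lhs.public_send(expr.operator, expr.rhs)
end

puts calculate(Expr.new(15.0, 7.0, :*))  # 105.0
```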

    Handling errors

When the library is unable to connect to the API, or if the API returns a non-success status code (i.e., a 4xx or 5xx response), a subclass of Anthropic::Errors::APIError will be raised:

    begin
      message = anthropic.messages.create(
        max_tokens: 1024,
        messages: [{role: "user", content: "Hello, Claude"}],
        model: :"claude-opus-4-6"
      )
    rescue Anthropic::Errors::APIConnectionError => e
      puts("The server could not be reached")
      puts(e.cause)  # an underlying Exception, likely raised within `net/http`
    rescue Anthropic::Errors::RateLimitError => e
      puts("A 429 status code was received; we should back off a bit.")
    rescue Anthropic::Errors::APIStatusError => e
      puts("Another non-200-range status code was received")
      puts(e.status)
    end

Error classes are mapped to causes as follows:

Cause               Error Type
HTTP 400            BadRequestError
HTTP 401            AuthenticationError
HTTP 403            PermissionDeniedError
HTTP 404            NotFoundError
HTTP 409            ConflictError
HTTP 422            UnprocessableEntityError
HTTP 429            RateLimitError
HTTP >= 500         InternalServerError
Other HTTP error    APIStatusError
Timeout             APITimeoutError
Network error       APIConnectionError

    Retries

    Certain errors will be automatically retried 2 times by default, with a short exponential backoff.

    Connection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict, 429 Rate Limit, >=500 Internal errors, and timeouts will all be retried by default.

    You can use the max_retries option to configure or disable this:

    # Configure the default for all requests:
    anthropic = Anthropic::Client.new(
      max_retries: 0 # default is 2
    )
    
    # Or, configure per-request:
    anthropic.messages.create(
      max_tokens: 1024,
      messages: [{role: "user", content: "Hello, Claude"}],
      model: :"claude-opus-4-6",
      request_options: {max_retries: 5}
    )

    Timeouts

    By default, requests will time out after 600 seconds. You can use the timeout option to configure or disable this:

    # Configure the default for all requests:
    anthropic = Anthropic::Client.new(
      timeout: nil # default is 600
    )
    
    # Or, configure per-request:
    anthropic.messages.create(
      max_tokens: 1024,
      messages: [{role: "user", content: "Hello, Claude"}],
      model: :"claude-opus-4-6",
      request_options: {timeout: 5}
    )

    On timeout, Anthropic::Errors::APITimeoutError is raised.

    Note that requests that time out are retried by default.

    Pagination

    List methods in the Claude API are paginated.

    This library provides auto-paginating iterators with each list response, so you do not have to request successive pages manually:

    page = anthropic.messages.batches.list(limit: 20)
    
    # Fetch single item from page.
    batch = page.data[0]
    puts(batch.id)
    
    # Automatically fetches more pages as needed.
    page.auto_paging_each do |batch|
      puts(batch.id)
    end

Alternatively, you can use the #next_page? and #next_page methods for more granular control when working with pages.

    if page.next_page?
      new_page = page.next_page
      puts(new_page.data[0].id)
    end

    File uploads

Request parameters that correspond to file uploads can be passed as raw file contents, a Pathname instance, a StringIO, and more.

    require "pathname"
    
    # Use `Pathname` to send the filename and/or avoid paging a large file into memory:
    file_metadata = anthropic.beta.files.upload(file: Pathname("/path/to/file"))
    
    # Alternatively, pass file contents or a `StringIO` directly:
    file_metadata = anthropic.beta.files.upload(file: File.read("/path/to/file"))
    
    # Or, to control the filename and/or content type:
    file = Anthropic::FilePart.new(File.read("/path/to/file"), filename: "/path/to/file", content_type: "...")
    file_metadata = anthropic.beta.files.upload(file: file)
    
    puts(file_metadata.id)

    Note that you can also pass a raw IO descriptor, but this disables retries, as the library can't be sure if the descriptor is a file or pipe (which cannot be rewound).
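The rewind caveat is plain Ruby IO behavior: a StringIO (like a File) can be rewound and re-read for a retry, while a pipe raises Errno::ESPIPE when seeked:

```ruby
require "stringio"

# A StringIO can be rewound, so its contents can be re-sent on a retry:
io = StringIO.new("file contents")
first = io.read
io.rewind
puts first == io.read  # true

# A pipe cannot seek, which is why retries are disabled for raw descriptors
# the library cannot prove to be rewindable:
reader, writer = IO.pipe
begin
  reader.rewind
rescue Errno::ESPIPE
  puts "pipes cannot be rewound"
ensure
  reader.close
  writer.close
end
```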

    Sorbet

    This library provides comprehensive RBI definitions, and has no dependency on sorbet-runtime.

    You can provide typesafe request parameters like so:

    anthropic.messages.create(
      max_tokens: 1024,
      messages: [Anthropic::MessageParam.new(role: "user", content: "Hello, Claude")],
      model: :"claude-opus-4-6"
    )

    Or, equivalently:

    # Hashes work, but are not typesafe:
    anthropic.messages.create(
      max_tokens: 1024,
      messages: [{role: "user", content: "Hello, Claude"}],
      model: :"claude-opus-4-6"
    )
    
    # You can also splat a full Params class:
    params = Anthropic::MessageCreateParams.new(
      max_tokens: 1024,
      messages: [Anthropic::MessageParam.new(role: "user", content: "Hello, Claude")],
      model: :"claude-opus-4-6"
    )
    anthropic.messages.create(**params)

    Enums

Since this library does not depend on sorbet-runtime, it cannot provide T::Enum instances. Instead, we provide "tagged symbols", which are always plain Symbol primitives at runtime:

    # :auto
    puts(Anthropic::MessageCreateParams::ServiceTier::AUTO)
    
    # Revealed type: `T.all(Anthropic::MessageCreateParams::ServiceTier, Symbol)`
    T.reveal_type(Anthropic::MessageCreateParams::ServiceTier::AUTO)

    Enum parameters have a "relaxed" type, so you can either pass in enum constants or their literal value:

    # Using the enum constants preserves the tagged type information:
    anthropic.messages.create(
      service_tier: Anthropic::MessageCreateParams::ServiceTier::AUTO,
      # ...
    )
    
    # Literal values are also permissible:
    anthropic.messages.create(
      service_tier: :auto,
      # ...
    )

    BaseModel

    All parameter and response objects inherit from Anthropic::Internal::Type::BaseModel, which provides several conveniences, including:

    1. All fields, including unknown ones, are accessible with obj[:prop] syntax, and can be destructured with obj => {prop: prop} or pattern-matching syntax.

    2. Structural equivalence for equality; if two API calls return the same values, comparing the responses with == will return true.

    3. Both instances and the classes themselves can be pretty-printed.

    4. Helpers such as #to_h, #deep_to_h, #to_json, and #to_yaml.
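The indexing, destructuring, and equality behaviors above can be sketched with a plain Ruby Hash, which supports the same obj[:prop] access and => pattern-matching syntax (no SDK objects are constructed here):

```ruby
message = {role: "assistant", content: "Hello!", extra: "surprise"}

# 1. Unknown fields stay accessible with [] indexing:
puts message[:extra]          # surprise

# Rightward-assignment pattern matching destructures named fields:
message => {role:, content:}
puts "#{role}: #{content}"    # assistant: Hello!

# 2. Structural equivalence: equal values compare equal with ==
puts message == message.dup   # true
```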

    Concurrency and connection pooling

Anthropic::Client instances are thread-safe, but are only fork-safe when there are no in-flight HTTP requests.

    Each instance of Anthropic::Client has its own HTTP connection pool with a default size of 99. As such, we recommend instantiating the client once per application in most settings.

    When all available connections from the pool are checked out, requests wait for a new connection to become available, with queue time counting towards the request timeout.

    Unless otherwise specified, other classes in the SDK do not have locks protecting their underlying data structure.

    Making custom or undocumented requests

    Undocumented properties

You can send undocumented parameters to any endpoint, and read undocumented response properties, like so:

Note that an extra_ parameter with the same name as a documented parameter overrides it. For security reasons, ensure these options are only used with trusted input data.

    message =
      anthropic.messages.create(
        max_tokens: 1024,
        messages: [{role: "user", content: "Hello, Claude"}],
        model: :"claude-opus-4-6",
        request_options: {
          extra_query: {my_query_parameter: value},
          extra_body: {my_body_parameter: value},
          extra_headers: {"my-header": value}
        }
      )
    
    puts(message[:my_undocumented_property])

    Undocumented request params

    If you want to explicitly send an extra param, you can do so with the extra_query, extra_body, and extra_headers under the request_options: parameter when making a request, as seen in the examples above.

    Undocumented endpoints

    To make requests to undocumented endpoints while retaining the benefit of auth, retries, and so on, you can make requests using client.request, like so:

    response = client.request(
      method: :post,
      path: '/undocumented/endpoint',
      query: {"dog": "woof"},
      headers: {"useful-header": "interesting-value"},
      body: {"hello": "world"}
    )

    Platform integrations

    For detailed platform setup guides, see:

    • Amazon Bedrock
    • Google Vertex AI

    Amazon Bedrock

    This library also provides support for the Anthropic Bedrock API if you install this library with the aws-sdk-bedrockruntime gem.

    You can then instantiate a separate Anthropic::BedrockClient class, and use AWS's standard guide for configuring credentials. It has the same API as the base Anthropic::Client class.

Note that the model ID required is different for Bedrock models. Depending on the model you want to use, you will need either AWS's model ID for Anthropic models, which can be found in AWS's Bedrock model catalog, or an inference profile ID (e.g. us.anthropic.claude-3-5-haiku-20241022-v1:0 for Claude 3.5 Haiku).

    require "anthropic"
    
    anthropic = Anthropic::BedrockClient.new
    
    message = anthropic.messages.create(
      max_tokens: 1024,
      messages: [
        {
          role: "user",
          content: "Hello, Claude"
        }
      ],
      model: "anthropic.claude-opus-4-6-v1"
    )
    
    puts(message)

    Google Vertex AI

    This library also provides support for the Anthropic Vertex API if you install this library with the googleauth gem.

    You can then import and instantiate a separate Anthropic::VertexClient class, and use Google's guide for configuring Application Default Credentials. It has the same API as the base Anthropic::Client class.

    require "anthropic"
    
    anthropic = Anthropic::VertexClient.new(region: "us-east5", project_id: "my-project-id")
    
    message = anthropic.messages.create(
      max_tokens: 1024,
      messages: [
        {
          role: "user",
          content: "Hello, Claude"
        }
      ],
      model: "claude-opus-4-6"
    )
    
    puts(message)

    Additional resources

    • GitHub repository
    • RubyDoc documentation
    • API reference
    • Streaming guide
