
Python SDK

Automatic LLM tracing for OpenAI, Anthropic, and LiteLLM with zero code changes.

The carrot-ai Python SDK captures every LLM call as a structured trace — automatically, in the background, with no changes to your application code.

Installation

pip install carrot-ai

Initialization

import carrot_ai

carrot_ai.init(api_key="sk-...")

Or set the CARROT_API_KEY environment variable and skip the init() call — the SDK initializes automatically on first use.
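For example, a script can rely entirely on the environment variable and never call init() itself (the key value below is a placeholder; in practice, set it in your shell or deployment environment rather than in code):

```python
import os

# Placeholder key for illustration only.
os.environ["CARROT_API_KEY"] = "sk-..."

# No carrot_ai.init() call is needed now: the SDK reads this
# variable and initializes itself on first use.
```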

| Parameter | Type | Default | Description |
|---|---|---|---|
| `api_key` | `str` | `CARROT_API_KEY` env var | Your Carrot API key |
| `base_url` | `str` | `https://api.carrotlabs.ai` | API endpoint |
| `flush_interval` | `float` | `5.0` | Seconds between background flushes |
| `batch_size` | `int` | `50` | Max traces per batch |
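A sketch of passing these parameters explicitly, for example to flush more aggressively in a high-throughput service (the values are illustrative, not recommendations):

```python
import carrot_ai

carrot_ai.init(
    api_key="sk-...",                      # or rely on CARROT_API_KEY
    base_url="https://api.carrotlabs.ai",  # the default endpoint
    flush_interval=2.0,                    # flush more often than the 5.0 s default
    batch_size=100,                        # send up to 100 traces per request
)
```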

Wrapping a client

from openai import OpenAI

client = carrot_ai.wrap(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)

The same pattern works for Anthropic:

from anthropic import Anthropic

client = carrot_ai.wrap(Anthropic())

response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)

The wrapped client behaves identically to the original — same methods, same return types, same streaming. Every create() call is captured as a trace in the background.
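For instance, streaming through a wrapped client looks exactly like streaming through the bare client. A sketch, assuming an `OPENAI_API_KEY` is configured in the environment:

```python
import carrot_ai
from openai import OpenAI

client = carrot_ai.wrap(OpenAI())

# stream=True returns the usual chunk iterator; the SDK assembles
# the full completion into a trace once the stream is consumed.
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
```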

Supported providers

| Provider | Client | Sync | Async | Streaming |
|---|---|---|---|---|
| OpenAI | `OpenAI` / `AsyncOpenAI` | ✓ | ✓ | ✓ |
| Anthropic | `Anthropic` / `AsyncAnthropic` | ✓ | ✓ | ✓ |
| LiteLLM | `litellm.completion` / `litellm.acompletion` | ✓ | ✓ | ✓ |
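The examples above wrap client objects, while LiteLLM exposes module-level functions. One plausible pattern is wrapping the module itself — note that whether `wrap()` accepts a module is an assumption here, not something this page documents, so verify against the SDK reference:

```python
import carrot_ai
import litellm

# Assumption: wrap() also accepts the litellm module and returns a
# proxy whose completion()/acompletion() calls are traced.
llm = carrot_ai.wrap(litellm)

response = llm.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
```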

Flushing traces

Traces are sent in the background automatically. For short-lived scripts, call flush() before exit to ensure all traces are submitted:

carrot_ai.flush()

The SDK also flushes automatically on normal interpreter shutdown.
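In environments where the interpreter may be frozen or killed before shutdown hooks run (some serverless runtimes, for example), an explicit flush at the end of each unit of work is a safe pattern. A minimal sketch, where `handler` stands in for your entry point:

```python
import carrot_ai

def handler(event):
    try:
        ...  # wrapped LLM calls happen here
    finally:
        # Block until queued traces are submitted, even if the
        # runtime never performs a normal interpreter shutdown.
        carrot_ai.flush()
```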
