
Tracing

Capture structured traces from your LLM calls for observability and custom model training.

Traces capture the full context of every LLM call in your application — what you sent, what you got back, how long it took, and how many tokens were used. This data powers both observability and custom model training.

Two ways to capture traces

1. SDK wrapper

Wrap your existing LLM client and traces are captured automatically:

import carrot_ai
from openai import OpenAI

carrot_ai.init(api_key="sk-...")
client = carrot_ai.wrap(OpenAI())

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)

The SDK works with OpenAI, Anthropic, and LiteLLM. Traces are sent asynchronously in the background, so they add no latency to your application's requests. See Python SDK for full details.
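Conceptually, the wrapper records the request, times the call, and captures the response and status. A minimal, library-agnostic sketch of that pattern (the traced_call helper is illustrative, not the actual SDK internals):

```python
import time

def traced_call(fn, **request):
    """Call an LLM client method and build a trace record around it.

    Illustrative only -- the real SDK hooks into the client transparently
    and sends traces in the background.
    """
    trace = {"input": request, "status": "success"}
    start = time.time()
    try:
        response = fn(**request)
        trace["output"] = response
    except Exception as exc:
        trace["status"] = "error"
        trace["output"] = {"error": str(exc)}
        raise
    finally:
        # End-to-end duration, matching the metadata.latency_ms field
        trace["metadata"] = {"latency_ms": int((time.time() - start) * 1000)}
    return response, trace
```

Usage would look like `response, trace = traced_call(client.chat.completions.create, model="gpt-4o", messages=[...])`.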

2. Inference header

If you're already using the Carrot inference API for your custom model, add the X-Carrot-Trace: true header to capture traces automatically:

client = OpenAI(
    base_url="https://api.carrotlabs.ai/v1",
    api_key="sk-...",
    default_headers={"X-Carrot-Trace": "true"},
)

The header approach only works for requests sent through the Carrot API. To trace calls to OpenAI, Anthropic, or other providers directly, use the SDK.

What gets captured

Each trace includes:

Field                      Description
input                      Messages, model, temperature, and other request params
output                     Assistant response (content + tool calls)
metadata.model             Model name
metadata.input_tokens      Prompt token count
metadata.output_tokens     Completion token count
metadata.latency_ms        End-to-end call duration in milliseconds
status                     "success" or "error"
started_at / ended_at      Request start and end timestamps
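A trace with these fields can be represented as a plain record. This sketch (field names taken from the table above; the dataclass itself is illustrative, not an SDK type) also shows how latency_ms relates to the two timestamps:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Trace:
    input: dict        # messages, model, temperature, ...
    output: dict       # assistant response (content + tool calls)
    metadata: dict     # model, input_tokens, output_tokens, latency_ms
    status: str        # "success" or "error"
    started_at: datetime
    ended_at: datetime

    @property
    def latency_ms(self) -> int:
        # metadata.latency_ms is the end-to-end duration between the timestamps
        return int((self.ended_at - self.started_at).total_seconds() * 1000)
```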

Viewing traces

Open the Traces page in the dashboard to browse, filter, and inspect traces. Click any trace to see the full input/output and metadata.

Filter by:

  • Model name
  • Status (success / error)
  • Tags
  • Date range

Custom trace ingestion

For advanced use cases, you can send traces directly via the REST API:

curl -X POST https://api.carrotlabs.ai/v1/traces/ingest \
  -H "Authorization: Bearer sk-..." \
  -H "Content-Type: application/json" \
  -d '{
    "traces": [{
      "name": "my-agent",
      "input": {"messages": [{"role": "user", "content": "Hello"}]},
      "output": {"message": {"role": "assistant", "content": "Hi!"}},
      "metadata": {"model": "gpt-4o", "input_tokens": 10, "output_tokens": 5},
      "status": "success"
    }]
  }'

See Trace Ingest API for the full specification.
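The same request can be made from Python with only the standard library. This sketch mirrors the curl example above (endpoint and field names as shown there; the build_payload and send_traces helpers are illustrative):

```python
import json
import urllib.request

API_URL = "https://api.carrotlabs.ai/v1/traces/ingest"

def build_payload(name, messages, reply, model, input_tokens, output_tokens):
    """Assemble the ingest body used in the curl example."""
    return {
        "traces": [{
            "name": name,
            "input": {"messages": messages},
            "output": {"message": {"role": "assistant", "content": reply}},
            "metadata": {
                "model": model,
                "input_tokens": input_tokens,
                "output_tokens": output_tokens,
            },
            "status": "success",
        }]
    }

def send_traces(payload, api_key):
    """POST the payload to the ingest endpoint (requires a valid API key)."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # network call
        return json.load(resp)
```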
