Python SDK
OpenAI
Automatic tracing for OpenAI chat completions, streaming, and tool calls.
Setup
import carrot_ai
from openai import OpenAI
carrot_ai.init(api_key="sk-...")
client = carrot_ai.wrap(OpenAI())

Chat completions
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
    temperature=0.7,
)
print(response.choices[0].message.content)

Streaming
Streaming works transparently — iterate as usual and the trace is captured when the stream finishes:
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)

Tool calls
Tool calls are captured automatically, including function names, arguments, and IDs:
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in SF?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
            },
        },
    }],
)

Async client
import asyncio

from openai import AsyncOpenAI

client = carrot_ai.wrap(AsyncOpenAI())

async def main():
    response = await client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())

With @trace for pipelines
Combine wrap() with @trace to capture multi-step workflows:
@carrot_ai.trace
def summarize_and_translate(text: str, language: str) -> str:
    summary = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Summarize concisely."},
            {"role": "user", "content": text},
        ],
    ).choices[0].message.content
    translation = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": f"Translate to {language}."},
            {"role": "user", "content": summary},
        ],
    ).choices[0].message.content
    return translation

Both LLM calls appear as child traces under the parent summarize_and_translate trace.
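To make the parent/child relationship concrete, here is a minimal, self-contained sketch of how nested tracing works in principle. It does not use carrot_ai: the trace decorator, _stack, and TRACES below are illustrative stand-ins, not SDK internals.

```python
import functools

_stack = []   # names of traces currently open (the call path)
TRACES = []   # finished (name, parent_name) records

def trace(fn):
    """Record fn's name and its parent trace, mimicking nested spans."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        parent = _stack[-1] if _stack else None
        _stack.append(fn.__name__)
        try:
            return fn(*args, **kwargs)
        finally:
            _stack.pop()
            TRACES.append((fn.__name__, parent))
    return wrapper

@trace
def llm_call(prompt):
    # Stands in for a wrapped client call.
    return f"response to {prompt!r}"

@trace
def pipeline(text):
    summary = llm_call(text)
    return llm_call(summary)

pipeline("hello")
# Both llm_call records carry "pipeline" as their parent;
# pipeline itself has no parent (it is the root trace).
```

The real SDK does the same bookkeeping for you: the wrapped client plays the role of llm_call, and any function decorated with @carrot_ai.trace becomes the enclosing parent.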
Default tags
Add default tags to every trace from a specific client:
client = carrot_ai.wrap(OpenAI(), tags=["production", "gpt-4o"])
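Tool calls returned by the model (see the Tool calls section above) arrive with JSON-encoded arguments, and executing them is your application's job, independent of tracing. A minimal dispatch sketch using only the standard library: the tool_call attribute shape (.id, .function.name, .function.arguments) is what the OpenAI SDK returns, while the get_weather handler, HANDLERS table, and dispatch helper are illustrative, not part of carrot_ai.

```python
import json

def get_weather(city: str) -> str:
    # Illustrative local implementation of the tool declared earlier.
    return f"Sunny in {city}"

HANDLERS = {"get_weather": get_weather}

def dispatch(tool_call) -> dict:
    """Run one tool call and build the follow-up message for the API.

    tool_call has the shape the OpenAI SDK returns: .id,
    .function.name, and .function.arguments (a JSON string).
    """
    args = json.loads(tool_call.function.arguments)
    result = HANDLERS[tool_call.function.name](**args)
    return {"role": "tool", "tool_call_id": tool_call.id, "content": result}
```

Append the returned message to your messages list and call chat.completions.create again; the wrapped client traces that follow-up call like any other.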