SDK Overview
Foil provides official SDKs for JavaScript and Python. Both SDKs offer core functionality for tracing, logging, and feedback collection, with the JavaScript SDK providing additional features for complex agent workflows.
Feature Comparison
| Feature | JavaScript | Python |
|---|---|---|
| Basic Logging | Yes | Yes |
| OpenAI Wrapper | Yes | Yes |
| OpenAI Agents SDK | Yes | Yes |
| Distributed Tracing | Yes | Yes |
| Automatic Span Hierarchy | Yes | Yes |
| Vercel AI Integration | Yes | - |
| Tool Call Tracking | Yes | Yes |
| Signal Recording | Yes | Yes |
| Streaming Support | Yes | Yes |
| Multimodal / Media | Yes | Yes |
JavaScript SDK
The JavaScript SDK is our most full-featured SDK, designed for complex AI applications with multiple agents, tools, and nested workflows.
Best for:
Node.js backend services
Next.js applications
Complex agent architectures
Vercel AI integration
npm install @getfoil/foil-js
JavaScript SDK Docs: Complete JavaScript SDK documentation
Key Features
Foil - High-level tracing API with automatic span management:
import OpenAI from 'openai';
import { Foil } from '@getfoil/foil-js';

const openai = new OpenAI();

const foil = new Foil({
  apiKey: 'your-api-key',
  agentName: 'my-agent',
  instrumentModules: { openAI: OpenAI },
});

await foil.trace(async (ctx) => {
  const messages = [{ role: 'user', content: 'Search for recent news' }];

  // LLM calls are auto-instrumented
  // (`tools` and `toolMap` are your tool definitions, defined elsewhere)
  let response = await openai.chat.completions.create({
    model: 'gpt-4o', messages, tools,
  });

  // Agentic loop — LLM decides which tools to call
  while (response.choices[0].message.tool_calls) {
    const toolMessages = await ctx.executeTools(response, toolMap);
    messages.push(response.choices[0].message, ...toolMessages);
    response = await openai.chat.completions.create({
      model: 'gpt-4o', messages, tools,
    });
  }

  return response.choices[0].message.content;
});
Framework Integrations:
// Vercel AI SDK
import { createVercelAICallbacks } from '@getfoil/foil-js' ;
const callbacks = createVercelAICallbacks ( tracer );
Python SDK
The Python SDK provides tracing, logging, and feedback collection for Python applications, with automatic OpenAI instrumentation.
Best for:
Python backend services
FastAPI/Flask applications
Jupyter notebooks
LLM integrations
Python SDK Docs: Complete Python SDK documentation
Key Features
Foil - High-level tracing API with automatic span management:
from foil import Foil

foil = Foil(
    api_key='your-api-key',
    agent_name='my-agent',
)

def my_workflow(ctx):
    result = ctx.llm_call('gpt-4o', lambda _: openai.chat.completions.create(
        model='gpt-4o',
        messages=[{'role': 'user', 'content': 'Hello!'}],
    ))
    data = ctx.tool('search', lambda _: search_api(query))
    return result.choices[0].message.content

foil.trace(my_workflow, name='my-trace')
OpenAI Wrapper - Automatic tracing for all OpenAI calls:
from openai import OpenAI
from foil import Foil

client = OpenAI()
foil = Foil(api_key='your-api-key')
wrapped = foil.wrap_openai(client)

response = wrapped.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}]
)
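Conceptually, a client wrapper like this can be built as a delegating proxy: it forwards attribute access to the underlying client and records each method invocation. The sketch below is illustrative only, not Foil's actual implementation; the `TracingWrapper` name and the shared `calls` list are assumptions for the example.

```python
import functools

class TracingWrapper:
    """Illustrative proxy: forwards attribute access to a wrapped client
    and records the name of every callable it invokes."""

    def __init__(self, target, calls):
        self._target = target
        self._calls = calls  # shared list recording each traced call name

    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if callable(attr):
            @functools.wraps(attr)
            def traced(*args, **kwargs):
                self._calls.append(name)  # record the call before forwarding
                return attr(*args, **kwargs)
            return traced
        # Non-callable attributes (e.g. client.chat.completions) are nested
        # namespaces; wrap them recursively so their methods are traced too.
        return TracingWrapper(attr, self._calls)
```

A real wrapper would record arguments, timings, and streaming chunks rather than just names, but the delegation pattern is the same.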
Authentication
Both SDKs use API key authentication. Get your API key from the Foil Dashboard .
# Environment variable (recommended)
export FOIL_API_KEY=sk_live_xxx_yyy
Never commit API keys to version control. Use environment variables or a secrets manager.
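In Python, for example, the key can be read from the environment at startup and passed to the client explicitly. `load_foil_api_key` is an illustrative helper name, not part of the SDK; only the `FOIL_API_KEY` variable comes from the docs above.

```python
import os

def load_foil_api_key() -> str:
    # Read the key from the environment (as recommended above) and fail
    # loudly rather than falling back to a hard-coded secret.
    key = os.environ.get("FOIL_API_KEY")
    if not key:
        raise RuntimeError(
            "FOIL_API_KEY is not set; export it or use a secrets manager"
        )
    return key
```

You would then construct the client with `Foil(api_key=load_foil_api_key())`.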
Debug Mode
Enable debug logging to troubleshoot integration issues:
// JavaScript
const foil = new Foil({
  apiKey: 'your-api-key',
  agentName: 'my-agent',
  debug: true, // or set FOIL_DEBUG=true
});
# Python
foil = Foil(
    api_key='your-api-key',
    agent_name='my-agent',
    debug=True,  # or set FOIL_DEBUG=true
)
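As a sketch of how an environment-driven flag like `FOIL_DEBUG` is typically parsed: the docs above confirm only `FOIL_DEBUG=true`; the other truthy spellings accepted here are an assumption, and `debug_enabled` is an illustrative helper, not an SDK function.

```python
import os

def debug_enabled(env=os.environ) -> bool:
    # Treat common truthy spellings as enabling debug mode;
    # anything else (including unset) leaves it off.
    return env.get("FOIL_DEBUG", "").strip().lower() in ("1", "true", "yes")
```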