LangSmith Exporter for Mastra Observability
Trust: ★★★☆☆ (0.90) · 0 validations · developer_reference
Published: 2026-05-10 · Source: crawler_authoritative
Situation
Guide for configuring LangSmith tracing exporter in Mastra applications to send traces to LangChain’s monitoring platform.
Insight
The LangSmith exporter (the @mastra/langsmith package) integrates with Mastra's Observability system to send traces to LangSmith for monitoring and evaluating LLM applications. It supports zero-config setup via environment variables (LANGSMITH_API_KEY, LANGCHAIN_PROJECT, LANGSMITH_BASE_URL) or explicit configuration via constructor options. The constructor accepts: apiKey (required unless supplied via LANGSMITH_API_KEY), apiUrl (defaults to https://api.smith.langchain.com), projectName (overrides the LANGCHAIN_PROJECT env var), callerOptions with timeout and maxRetries, logLevel (debug | info | warn | error), and the hideInputs and hideOutputs booleans. Dynamic per-span configuration uses the withLangsmithMetadata helper, which accepts projectName, sessionId, and sessionName fields; it merges with existing metadata and works with buildTracingOptions from @mastra/observability. Runtime project routing is supported via requestContext, enabling conditional logic based on user tier or other runtime conditions.
Action
- Install package: npm install @mastra/langsmith@latest (or pnpm/yarn/bun equivalent)
- Set required env var: LANGSMITH_API_KEY=ls-xxxxxxxxxxxx
- Optional env vars: LANGCHAIN_PROJECT (default project), LANGSMITH_BASE_URL (for self-hosted)
- Import: import { LangSmithExporter } from '@mastra/langsmith'
- Add to Observability config: new Observability({ configs: { langsmith: { serviceName: 'my-service', exporters: [new LangSmithExporter()] } } })
- For explicit config, pass apiKey and other options directly to constructor
- For dynamic routing, use withLangsmithMetadata with buildTracingOptions in Agent defaultOptions, optionally using requestContext for runtime-based project/session routing
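Taken together, the steps above amount to a wiring sketch like the following (the service name 'my-service' is a placeholder, and LANGSMITH_API_KEY is assumed to be set in the environment):

```typescript
// Minimal sketch: register the LangSmith exporter with Mastra's
// Observability system. With LANGSMITH_API_KEY set, the exporter
// needs no constructor arguments.
import { Mastra } from '@mastra/core'
import { Observability } from '@mastra/observability'
import { LangSmithExporter } from '@mastra/langsmith'

export const mastra = new Mastra({
  observability: new Observability({
    configs: {
      langsmith: {
        serviceName: 'my-service', // placeholder service name
        exporters: [new LangSmithExporter()],
      },
    },
  }),
})
```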
Result
Traces are sent to LangSmith where users gain insights into model performance, debugging capabilities, and evaluation workflows. Project routing and session grouping enable organization of traces by customer, environment, or feature.
Applicability
Requires @mastra/core and @mastra/observability packages. Works with Mastra Agent instances using defaultOptions with tracingOptions.
Original content
LangSmith exporter
LangSmith is LangChain’s platform for monitoring and evaluating LLM applications. The LangSmith exporter sends your traces to LangSmith, providing insights into model performance, debugging capabilities, and evaluation workflows.
Installation
npm:
npm install @mastra/langsmith@latest
pnpm:
pnpm add @mastra/langsmith@latest
Yarn:
yarn add @mastra/langsmith@latest
Bun:
bun add @mastra/langsmith@latest
Configuration
Prerequisites
- LangSmith Account: Sign up at smith.langchain.com
- API Key: Generate an API key in LangSmith Settings → API Keys
- Environment Variables: Set your credentials
# Required
LANGSMITH_API_KEY=ls-xxxxxxxxxxxx
# Optional
LANGCHAIN_PROJECT=my-project # Default project for traces
LANGSMITH_BASE_URL=https://api.smith.langchain.com # For self-hosted
Zero-Config Setup
With environment variables set, use the exporter with no configuration:
import { Mastra } from '@mastra/core'
import { Observability } from '@mastra/observability'
import { LangSmithExporter } from '@mastra/langsmith'
export const mastra = new Mastra({
observability: new Observability({
configs: {
langsmith: {
serviceName: 'my-service',
exporters: [new LangSmithExporter()],
},
},
}),
})
Explicit Configuration
You can also pass credentials directly (takes precedence over environment variables):
import { Mastra } from '@mastra/core'
import { Observability } from '@mastra/observability'
import { LangSmithExporter } from '@mastra/langsmith'
export const mastra = new Mastra({
observability: new Observability({
configs: {
langsmith: {
serviceName: 'my-service',
exporters: [
new LangSmithExporter({
apiKey: process.env.LANGSMITH_API_KEY,
}),
],
},
},
}),
})
Configuration options
Complete Configuration
new LangSmithExporter({
// Required credentials
apiKey: process.env.LANGSMITH_API_KEY!,
// Optional settings
apiUrl: process.env.LANGSMITH_BASE_URL, // Default: https://api.smith.langchain.com
projectName: 'my-project', // Project to send traces to (overrides LANGCHAIN_PROJECT env var)
callerOptions: {
// HTTP client options
timeout: 30000, // Request timeout in ms
maxRetries: 3, // Retry attempts
},
logLevel: 'info', // Diagnostic logging: debug | info | warn | error
// LangSmith-specific options
hideInputs: false, // Hide input data in UI
hideOutputs: false, // Hide output data in UI
})
Environment variables
| Variable | Description |
|---|---|
| LANGSMITH_API_KEY | Your LangSmith API key (required) |
| LANGCHAIN_PROJECT | Default project name for traces (optional, defaults to "default") |
| LANGSMITH_BASE_URL | API URL for self-hosted instances (optional) |
The projectName config option takes precedence over the LANGCHAIN_PROJECT environment variable, allowing you to programmatically route traces to different projects.
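For instance, even with LANGCHAIN_PROJECT set in the environment, an explicit projectName wins (a sketch; 'billing-traces' is a placeholder project name):

```typescript
// projectName passed to the constructor takes precedence over the
// LANGCHAIN_PROJECT environment variable: traces from this exporter
// land in 'billing-traces' regardless of the env var's value.
import { LangSmithExporter } from '@mastra/langsmith'

const exporter = new LangSmithExporter({
  apiKey: process.env.LANGSMITH_API_KEY!,
  projectName: 'billing-traces', // placeholder project name
})
```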
Dynamic configuration
You can dynamically override LangSmith settings per-span using withLangsmithMetadata. This is useful for routing traces to different projects based on runtime conditions (e.g., customer, environment, or feature).
Using the Helper
Use withLangsmithMetadata with buildTracingOptions to set LangSmith-specific options:
import { Agent } from '@mastra/core/agent'
import { buildTracingOptions } from '@mastra/observability'
import { withLangsmithMetadata } from '@mastra/langsmith'
export const supportAgent = new Agent({
id: 'support-agent',
name: 'support-agent',
instructions: 'You are a helpful support agent.',
model: 'openai/gpt-5.4',
defaultOptions: {
tracingOptions: buildTracingOptions(withLangsmithMetadata({ projectName: 'customer-support' })),
},
})
Dynamic Project Routing
Use requestContext to route traces to different projects based on runtime conditions.
import { Agent } from '@mastra/core/agent'
import { buildTracingOptions } from '@mastra/observability'
import { withLangsmithMetadata } from '@mastra/langsmith'
export const supportAgent = new Agent({
id: 'support-agent',
name: 'support-agent',
instructions: 'You are a helpful support agent.',
model: 'openai/gpt-5.4',
defaultOptions: ({ requestContext }) => {
const userTier = requestContext?.get('user-tier') as string
const userId = requestContext?.get('user-id') as string
return {
tracingOptions: buildTracingOptions(
withLangsmithMetadata({
projectName: userTier === 'enterprise' ? 'enterprise-traces' : 'standard-traces',
sessionId: userId,
}),
),
}
},
})
Available Fields
The withLangsmithMetadata helper accepts these fields:
| Field | Type | Description |
|---|---|---|
| projectName | string | Override the project for this trace |
| sessionId | string | Group related traces by session |
| sessionName | string | Display name for the session |
All fields are optional. The helper merges with any existing metadata, so you can call it multiple times or combine with other tracing options.
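The merge behavior can be pictured as a shallow spread over a metadata object. The helper below is a hypothetical stand-in for illustration only, not the actual @mastra/langsmith implementation:

```typescript
// Illustration only: a hypothetical shallow merge showing how
// LangSmith fields can layer over existing trace metadata.
// The real helper is withLangsmithMetadata from @mastra/langsmith.
type LangsmithFields = {
  projectName?: string
  sessionId?: string
  sessionName?: string
}

function mergeMetadata(
  existing: Record<string, unknown>,
  fields: LangsmithFields,
): Record<string, unknown> {
  // Later fields win for the same key; unrelated keys are preserved.
  return { ...existing, ...fields }
}

const merged = mergeMetadata(
  { projectName: 'standard-traces', requestId: 'req-1' },
  { projectName: 'enterprise-traces', sessionId: 'user-42' },
)
// projectName is overridden, sessionId is added, requestId survives
```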
Links
- Platform: Dev Framework · Mastra
- Source: https://mastra.ai/docs/observability/tracing/exporters/langsmith