Studio Overview
Trust: ★★★☆☆ (0.90) · 0 validations · developer_reference
Published: 2026-05-11 · Source: crawler_authoritative
Situation
Mastra Studio documentation covering the interactive UI for building, testing, and managing agents, workflows, and tools during development and production.
Insight
Mastra Studio provides an interactive UI for building, testing, and managing agents, workflows, and tools. It runs locally during development and can be deployed to production, and offers real-time editing of agents, workflows, and other Mastra application components.
Primitives: the Agents feature allows direct chat interaction with agents, dynamic model switching, and adjustment of settings like temperature and top-p; users can follow each step of agent reasoning, view tool call outputs, and observe traces and logs. The Workflows feature visualizes workflows as a graph with step-by-step execution and real-time interface updates showing the active step and path taken. Processors allow viewing the input/output processors attached to agents, including guardrails, token limiters, and custom processors. MCP servers can be listed and their tools explored. Tools can be run in isolation to observe behavior and debug issues.
Workspaces provide a built-in file browser for the agent's workspace filesystem, with support for directory creation and file deletion in writable workspaces; read-only workspaces are labeled accordingly. The Skills tab lists discovered skills with instructions, references, and metadata, supporting installation from skills.sh. Request context sets runtime variables that flow into agent instructions and tools through dependency injection, supporting JSON editing or schema-driven forms when a requestContextSchema is defined.
Evaluation features include Scorers for asynchronous evaluation of agent outputs, Datasets for managing test case collections (import from CSV/JSON, input and ground-truth schemas, version pinning), and Experiments for running dataset items against agents/workflows/scorers with side-by-side comparison. Settings configure the Mastra instance URL, optional API prefix, custom headers for authentication tokens, and theme (dark/light/system).
The server runs at http://localhost:4111 by default, configurable via the server option in src/mastra/index.ts with host and port settings.
Action
To start Studio: run npm run dev, pnpm run dev, yarn dev, or bun run dev (depending on package manager). Alternatively, run mastra dev directly. Access the Studio UI at http://localhost:4111 and the Swagger UI at http://localhost:4111/swagger-ui. To configure Studio, use the server option in src/mastra/index.ts to change the host and port. For HTTPS development, use the --https flag with mastra dev --https, which automatically generates a private key and certificate for localhost. Settings UI configures: Mastra instance URL (e.g. http://localhost:4111), API prefix (defaults to /api), custom headers for authentication tokens or routing headers, and theme selection (dark/light/system). For production deployment, deploy using the Mastra platform or your own infrastructure. Add authentication to control access to deployed Studio.
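The startup commands above can be collected into a short shell sketch (the package-manager choice is interchangeable, and the `npx` prefix assumes the Mastra CLI is a project dependency rather than installed globally):

```shell
# Start Studio with your package manager's dev script (pick one)
npm run dev        # or: pnpm run dev / yarn dev / bun run dev

# Or invoke the Mastra CLI directly
npx mastra dev

# HTTPS development: generates a private key and certificate for localhost
npx mastra dev --https
```

Once running, the Studio UI is at http://localhost:4111 and the Swagger UI at http://localhost:4111/swagger-ui.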
Result
Studio provides an interactive UI at http://localhost:4111 for interacting with agents, workflows, and tools in real time during development, with Swagger UI available at http://localhost:4111/swagger-ui for REST API exploration.
Applicability
Studio runs locally during development and can be deployed to production. HTTPS development requires the --https flag.
Original content
Studio
Studio provides an interactive UI for building, testing, and managing your agents, workflows, and tools. Run it locally during development, add authentication, or deploy it to production so your team can manage agents, monitor performance, and gain insights through built-in observability.
Start Studio
If you created your application with create mastra, start the development server using the dev script. You can also run it directly with mastra dev.
npm:
npm run dev
pnpm:
pnpm run dev
Yarn:
yarn dev
Bun:
bun run dev
Once the server is running, you can:
- Open the Studio UI at http://localhost:4111 to interact with your agents, workflows, and tools.
- Visit http://localhost:4111/swagger-ui to discover and interact with the underlying REST API.
While Studio is running, you can edit your agents, workflows, and other parts of your Mastra application in real time.
Deploy Studio
When you’re ready to share Studio with your team, you can deploy it to production using the Mastra platform or your own infrastructure. Visit the deployment docs to learn how.
Primitives
Agents
Chat with your agent directly, dynamically switch models, and tweak settings like temperature and top-p to understand how they affect the output.
When you interact with your agent, you can follow each step of its reasoning, view tool call outputs, and observe traces and logs to see how responses are generated. You can also attach scorers to measure and compare response quality over time.
Use Editor to let non-technical team members iterate on agents, version every change, and run experiments without redeploying.
Workflows
Visualize your workflow as a graph and run it step by step with a custom input. During execution, the interface updates in real time to show the active step and the path taken.
When running a workflow, you can also view detailed traces showing tool calls, raw JSON outputs, and any errors that might have occurred along the way.
Processors
View the input and output processors attached to each agent. The agent detail panel lists every processor by name and type, so you can verify your guardrails, token limiters, and custom processors are wired up correctly before testing.
See processors and guardrails for configuration details.
MCP servers
List the MCP servers attached to your Mastra instance and explore their available tools.
Tools
Run tools on their own to observe behavior and test them before assigning them to an agent. If something goes wrong, re-run a tool in isolation to debug the issue.
Workspaces
Browse the files in your agent’s workspace filesystem using a built-in file browser. Switch between workspace mounts, create directories, and view file contents with syntax highlighting. Writable workspaces allow directory creation and file deletion; read-only workspaces are labeled accordingly. The Skills tab lists all discovered skills with their instructions, references, and metadata. Install community skills from skills.sh or remove existing ones.
See workspaces for configuration details.
Request context
Set runtime variables that flow into your agent’s instructions and tools through dependency injection. Edit request context as JSON or use a schema-driven form when your agent defines a requestContextSchema. Values persist across test chats and experiments, so you can trigger conditional flows without restarting.
See request context for configuration details.
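As a hedged sketch of how a requestContextSchema and injected runtime variables might be wired up (the specific Agent options shown, the zod schema, and the dynamic-instructions callback are assumptions based on typical Mastra usage, not confirmed by this page):

```typescript
import { Agent } from "@mastra/core/agent";
import { z } from "zod";

// With a schema defined, Studio can render a schema-driven form for
// request context instead of a raw JSON editor.
export const supportAgent = new Agent({
  name: "support-agent",
  model: "openai/gpt-4o-mini", // hypothetical model id
  requestContextSchema: z.object({
    plan: z.enum(["free", "pro"]),
    userId: z.string(),
  }),
  // Runtime variables flow into instructions via dependency injection.
  instructions: ({ requestContext }) =>
    `Answer as a support agent for a ${requestContext.get("plan")}-plan user.`,
});
```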
Evaluation
Scorers
The Scorers tab displays the results of your agent’s scorers as they run. When messages pass through your agent, the defined scorers evaluate each output asynchronously and render their results here. This allows you to understand how your scorers respond to different interactions, compare performance across test cases, and identify areas for improvement.
Datasets
Create and manage collections of test cases to evaluate your agents and workflows. Import items from CSV or JSON, define input and ground-truth schemas, and pin to specific versions so you can reproduce experiments exactly. Run experiments with scorers to compare quality across prompts, models, or code changes.
See datasets overview for the full API and versioning details.
Experiments
Run all items in a dataset against an agent, workflow, or scorer and collect the results in one place. Select a target, optionally attach scorers, and trigger the experiment. The results view shows each item’s input, output, status, and individual score breakdowns. Compare two experiments side by side to measure the impact of prompt, model, or code changes.
See datasets overview for setup details.
Observability
Visit the Studio observability docs to learn more.
Settings
Configure the connection between Studio and your Mastra server. The settings page includes:
- Mastra instance URL: The base URL of your Mastra server (e.g. http://localhost:4111).
- API prefix: Optional path prefix for all API requests (defaults to /api).
- Custom headers: Add key-value pairs sent with every request, useful for authentication tokens or routing headers.
- Theme: Switch between dark, light, or system theme.
Code configuration
In addition to the settings UI, you can also configure the local development server and Studio through the server option in your src/mastra/index.ts.
By default, Studio runs at http://localhost:4111. You can change the host and port.
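A minimal sketch of that configuration (the server option with host and port comes from the text above; the exact import path and field values are assumptions based on a typical Mastra project):

```typescript
// src/mastra/index.ts — change where the local Studio server listens.
import { Mastra } from "@mastra/core";

export const mastra = new Mastra({
  server: {
    host: "0.0.0.0", // assumption: expose beyond localhost; default is localhost
    port: 5000,      // default is 4111
  },
});
```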
Mastra also supports HTTPS development through the --https flag, which automatically creates and manages certificates for your project. When you run mastra dev --https, a private key and certificate are generated for localhost (or your configured host). Visit the HTTPS reference to learn more.
Next steps
- Learn how to deploy Studio for production use.
- Add authentication to control access to your deployed Studio.
- Explore Studio observability to monitor agent performance through metrics, traces, and logs.
Links
- Platform: Dev Framework · Mastra
- Source: https://mastra.ai/docs/studio/overview
See also: