Mastra Studio Overview
Trust: ★★★☆☆ (0.90) · 0 validations · developer_reference
Published: 2026-05-13 · Source: crawler_authoritative
Situation
Guide to using Mastra Studio, an interactive UI for building, testing, and managing agents, workflows, and tools in development and production.
Insight
Studio provides an interactive UI for building, testing, and managing agents, workflows, and tools, and supports both local development and production deployment. Key features:
- Agents: chat directly with agents, switch models dynamically, tweak settings like temperature and top-p, and view tool call outputs and reasoning traces.
- Workflows: visualize workflows as graphs, run them step by step with custom input, and view traces and errors in real time.
- Processors: view input/output processors attached to agents; verify guardrails, token limiters, and custom processors.
- MCP servers: list and explore MCP servers attached to Mastra.
- Tools: run tools independently to test behavior before assigning them to agents.
- Workspaces: browse the workspace filesystem, create directories, view file contents with syntax highlighting, and manage skills.
- Request context: set runtime variables through dependency injection, editable as JSON or via a schema-driven form.
Evaluation features include Scorers for measuring response quality, Datasets for test case collections (CSV/JSON import, version pinning), and Experiments to run dataset items against agents/workflows and compare results. Settings cover the Mastra instance URL, API prefix, custom headers for authentication, and theme (dark/light/system). Studio runs at localhost:4111 by default, with Swagger UI at /swagger-ui.
Action
Start Studio locally by running 'npm run dev' (or 'pnpm run dev', 'yarn dev', 'bun run dev') after creating an application with 'create mastra'. Access the Studio UI at localhost:4111 and Swagger UI at localhost:4111/swagger-ui. Edit agents, workflows, and other parts of the application in real time while Studio is running. Configure via the settings UI or the server option in src/mastra/index.ts; change the default host and port as needed. Enable HTTPS development with the 'mastra dev --https' flag, which auto-generates certificates for localhost. For production, deploy Studio on the Mastra platform or your own infrastructure.
Result
Studio provides an interactive UI accessible at localhost:4111 for building, testing, and managing agents, workflows, and tools, with Swagger API documentation at /swagger-ui. It enables real-time editing during development, observability through traces and logs, evaluation through scorers and experiments, and team collaboration in production deployments.
Applicable conditions
Studio requires a Mastra application created with 'create mastra'. The --https flag applies to local development.
Original content
Studio
Studio provides an interactive UI for building, testing, and managing your agents, workflows, and tools. Run it locally during development, add authentication, or deploy it to production so your team can manage agents, monitor performance, and gain insights through built-in observability.
Start Studio
If you created your application with create mastra, start the development server using the dev script. You can also run it directly with mastra dev.
npm: npm run dev
pnpm: pnpm run dev
Yarn: yarn dev
Bun: bun run dev
Once the server is running, you can:
- Open the Studio UI at localhost:4111 to interact with your agents, workflows, and tools.
- Visit localhost:4111/swagger-ui to discover and interact with the underlying REST API.
While Studio is running, you can edit your agents, workflows, and other parts of your Mastra application in real time.
Deploy Studio
When you’re ready to share Studio with your team, you can deploy it to production using Studio on Mastra platform or your own infrastructure. Visit the deployment docs to learn how.
Primitives
Agents
Chat with your agent directly, dynamically switch models, and tweak settings like temperature and top-p to understand how they affect the output.
When you interact with your agent, you can follow each step of its reasoning, view tool call outputs, and observe traces and logs to see how responses are generated. You can also attach scorers to measure and compare response quality over time.
While an agent response is streaming, you can send a follow-up message in the same thread. Studio shows the message as pending until the stream confirms it, then continues the response below that follow-up. Other Studio tabs that have the same thread open can observe the active stream.
Use Editor to let non-technical team members iterate on agents, version every change, and run experiments without redeploying.
Workflows
Visualize your workflow as a graph and run it step by step with a custom input. During execution, the interface updates in real time to show the active step and the path taken.
When running a workflow, you can also view detailed traces showing tool calls, raw JSON outputs, and any errors that might have occurred along the way.
Processors
View the input and output processors attached to each agent. The agent detail panel lists every processor by name and type, so you can verify your guardrails, token limiters, and custom processors are wired up correctly before testing.
See processors and guardrails for configuration details.
MCP servers
List the MCP servers attached to your Mastra instance and explore their available tools.
Tools
Run tools on their own to observe behavior and test them before assigning them to an agent. If something goes wrong, re-run a tool in isolation to debug the issue.
Workspaces
Browse the files in your agent’s workspace filesystem using a built-in file browser. Switch between workspace mounts, create directories, and view file contents with syntax highlighting. Writable workspaces allow directory creation and file deletion; read-only workspaces are labeled accordingly. The Skills tab lists all discovered skills with their instructions, references, and metadata. Install community skills from skills.sh or remove existing ones.
See workspaces for configuration details.
Request context
Set runtime variables that flow into your agent’s instructions and tools through dependency injection. Edit request context as JSON or use a schema-driven form when your agent defines a requestContextSchema. Values persist across test chats and experiments, so you can trigger conditional flows without restarting.
See request context for configuration details.
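As a sketch of the schema-driven form mentioned above: when an agent defines a requestContextSchema, Studio can render a form from it instead of raw JSON. The snippet below is illustrative only; the field names are hypothetical, and it assumes a Zod schema (Zod is commonly used for schemas in Mastra projects), not a documented shape.

```typescript
import { z } from "zod";

// Illustrative sketch only: field names are hypothetical.
// Studio can render a form from a schema like this instead of raw JSON.
export const requestContextSchema = z.object({
  userTier: z.enum(["free", "pro"]),   // e.g. drives conditional instructions
  locale: z.string().default("en-US"), // e.g. read inside a tool
});
```

Per the source, values set this way persist across test chats and experiments, so conditional flows can be triggered without restarting.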
Evaluation
Scorers
The Scorers tab displays the results of your agent’s scorers as they run. When messages pass through your agent, the defined scorers evaluate each output asynchronously and render their results here. This allows you to understand how your scorers respond to different interactions, compare performance across test cases, and identify areas for improvement.
Datasets
Create and manage collections of test cases to evaluate your agents and workflows. Import items from CSV or JSON, define input and ground-truth schemas, and pin to specific versions so you can reproduce experiments exactly. Run experiments with scorers to compare quality across prompts, models, or code changes.
See datasets overview for the full API and versioning details.
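Conceptually, a CSV import maps columns onto dataset item fields. The following library-free sketch shows that mapping; the column names ("input", "ground_truth") are illustrative assumptions, not a documented Mastra schema, and it ignores quoting and other real-world CSV details.

```typescript
// Minimal sketch: turn CSV lines into dataset items.
// Column names ("input", "ground_truth") are illustrative assumptions.
type DatasetItem = { input: string; groundTruth: string };

export function parseDatasetCsv(csv: string): DatasetItem[] {
  const [header, ...rows] = csv.trim().split("\n");
  const cols = header.split(",");
  const inputIdx = cols.indexOf("input");
  const truthIdx = cols.indexOf("ground_truth");
  return rows.map((row) => {
    const cells = row.split(",");
    return { input: cells[inputIdx], groundTruth: cells[truthIdx] };
  });
}
```

Each parsed item would then supply the input for an experiment run and the ground truth for scorers to compare against.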
Experiments
Run all items in a dataset against an agent, workflow, or scorer and collect the results in one place. Select a target, optionally attach scorers, and trigger the experiment. The results view shows each item’s input, output, status, and individual score breakdowns. Compare two experiments side by side to measure the impact of prompt, model, or code changes.
See datasets overview for setup details.
Observability
Visit the Studio observability docs to learn more.
Settings
Configure the connection between Studio and your Mastra server. The settings page includes:
- Mastra instance URL: The base URL of your Mastra server (e.g. http://localhost:4111).
- API prefix: Optional path prefix for all API requests (defaults to /api).
- Custom headers: Add key-value pairs sent with every request, useful for authentication tokens or routing headers.
- Theme: Switch between dark, light, or system theme.
Code configuration
In addition to the settings UI, you can also configure the local development server and Studio through the server option in your src/mastra/index.ts.
By default, Studio runs at http://localhost:4111. You can change the host and port.
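A minimal sketch of changing those defaults through the server option; the host and port values here are examples, and the exact option shape should be confirmed against the server reference.

```typescript
// src/mastra/index.ts — minimal sketch; host/port values are examples.
import { Mastra } from "@mastra/core";

export const mastra = new Mastra({
  server: {
    host: "0.0.0.0", // listen on all interfaces instead of localhost
    port: 8080,      // replace the default 4111
  },
});
```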
Mastra also supports HTTPS development through the --https flag, which automatically creates and manages certificates for your project. When you run mastra dev --https, a private key and certificate are generated for localhost (or your configured host). Visit the HTTPS reference to learn more.
Next steps
- Learn how to deploy Studio for production use.
- Add authentication to control access to your deployed Studio.
- Explore Studio observability to monitor agent performance through metrics, traces, and logs.
Links
- Platform: Dev Framework · Mastra
- Source: https://mastra.ai/docs/studio/overview
See also: