Documentation Index
Fetch the complete documentation index at: https://docs.moda.dev/llms.txt
Use this file to discover all available pages before exploring further.
What is Moda?
Moda is observability for AI agents. Add a few lines of code to your existing OpenAI, Anthropic, or framework calls, and Moda automatically clusters every production conversation, detects behavior signals and tool failures, and surfaces the intents your agent has never seen before. Most agent failures aren’t bugs. They’re intents you didn’t know your users had. Moda finds them first.
Get started in 5 minutes
Install the SDK, add two lines of code, and see your first cluster in the dashboard.
Why use Moda
All providers, no code changes
Works with OpenAI, Anthropic, AWS Bedrock, OpenRouter, Azure, and Vercel AI SDK. Switch models without touching the integration.
Live taxonomy of intent
Every conversation is clustered into a hierarchical taxonomy that reorganizes itself as your traffic shifts.
Emergent intent detection
See the intents your agent has never seen before, ranked by growth, before any of them have a canonical answer.
Built to be called
The same data the dashboard renders is available via the CLI, the Data API, and the MCP server.
How it works
- Install the Moda SDK in your application.
- Initialize with your API key. The SDK captures LLM calls in the background; no manual instrumentation is required.
- View clusters, behaviors, and failures in the Moda dashboard, or query them from your terminal.
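The steps above follow a pattern common to observability SDKs: an init call wraps the LLM client so every request is recorded as a side effect, with no manual instrumentation. Below is a minimal, self-contained sketch of that pattern under stated assumptions: the `init` function, the `captured` buffer, and `StubLLMClient` are all illustrative stand-ins, not Moda's actual API.

```python
# Sketch of background capture: init() wraps the client's completion
# method so each call is recorded as it happens. StubLLMClient stands
# in for a real OpenAI or Anthropic client.

captured = []  # events an SDK like this would ship to the dashboard

class StubLLMClient:
    def complete(self, prompt):
        return f"echo: {prompt}"

def init(client, api_key):
    """Hypothetical SDK entry point: patch complete() to record calls."""
    original = client.complete
    def wrapped(prompt):
        response = original(prompt)
        captured.append({"prompt": prompt, "response": response})
        return response
    client.complete = wrapped

client = StubLLMClient()
init(client, api_key="mk-...")       # the "two lines" of integration
client.complete("refund my order")   # captured automatically
```

Application code keeps calling the client exactly as before; the wrapper forwards the call and logs the prompt/response pair on the way through, which is why switching providers does not touch the integration.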
Next steps
Quickstart
Step-by-step guide to your first integration.
Ingestion options
All the ways to send data to Moda.
Data API + CLI
Query every cluster, behavior signal, and failure from your terminal or your CI.
Use cases
What teams build on top of the Moda Data API.