Quickstart
Get MetricChat running in under 5 minutes.
Install
Run the following command to start MetricChat with SQLite (no external database required):
docker run --pull always -d -p 3000:3000 metricchat/metricchat

To use PostgreSQL instead of SQLite, set the MC_DATABASE_URL environment variable:
docker run --pull always -d -p 3000:3000 \
-e MC_DATABASE_URL=postgresql+asyncpg://user:pass@host:5432/dbname \
metricchat/metricchat

Open http://localhost:3000 to begin setup.
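Before opening the browser, you can confirm the container started and the app is answering. This sketch uses standard docker and curl commands; exact output depends on your environment:

```shell
# List running containers started from the MetricChat image
docker ps --filter ancestor=metricchat/metricchat

# Check that the web UI answers on port 3000 (prints the HTTP status code)
curl -fsS -o /dev/null -w '%{http_code}\n' http://localhost:3000
```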
Onboarding
MetricChat walks you through a setup wizard on first launch. Each step takes less than a minute.
Step 1: Welcome
The setup wizard opens automatically on first launch. It guides you through connecting a model, a data source, and any initial context before you start chatting.
Step 2: Configure LLM
Connect to any LLM provider using your own API key. Supported providers:
- OpenAI — GPT-4o, GPT-4o mini, and other OpenAI models
- Anthropic — Claude Sonnet, Claude Opus, and other Claude models
- Google — Gemini Pro and Gemini Flash models
- Azure OpenAI — OpenAI models deployed on Azure
- Ollama — Local models with no external API required
MetricChat does not proxy your API key. It is encrypted and stored only within your own deployment.
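For a fully local setup, you can have an Ollama server running before pointing MetricChat at it. A minimal sketch, assuming Ollama is installed locally (the model name here is only an example, not a MetricChat requirement):

```shell
# Start the Ollama server; it listens on http://localhost:11434 by default
ollama serve &

# Pull an example model so it is available to serve
ollama pull llama3.1
```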
Step 3: Connect a Data Source
Select from any of the supported data source types:
- PostgreSQL
- Snowflake
- BigQuery
- MySQL
- Amazon Redshift
- AWS Athena
- DuckDB
- Salesforce
- And more
You can also upload a CSV or Excel file to instantly create a queryable DuckDB data source — no database connection or setup required. This is useful for quick ad-hoc analysis or working with data you already have locally.
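This works because DuckDB can query flat files directly, with no schema setup. As a local illustration of the same idea (requires the duckdb CLI; `sales.csv` is a hypothetical file, not something MetricChat creates):

```shell
# DuckDB treats a CSV file as a table, inferring column names and types
duckdb -c "SELECT COUNT(*) FROM read_csv_auto('sales.csv');"
```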
Step 4: Select Tables
Choose which tables the AI is allowed to access and query. You can select all tables or restrict access to specific ones. This selection can be changed later from the data source settings.
Step 5: Add Context
Two optional but recommended steps to improve answer quality:
- Suggest Instructions — Add business-specific rules, KPI definitions, and terminology. MetricChat uses these in every agent run to keep answers aligned with how your business works.
- Enrich Context — Connect a Git repository to pull in dbt model descriptions, LookML definitions, markdown documentation, or AGENTS.md files.
Both can be skipped now and configured later from the settings panel.
Step 6: Start Asking Questions
Setup is complete. Type a question in the chat interface and MetricChat will query your data, reason over the results, and return an answer with supporting charts or tables.
Next Steps
- Chat with Data — How to ask questions and get the most from your results
- Create Dashboards — Pin results and build shareable dashboards
- Add Instructions — Teach the AI your business rules and terminology
- Agent Architecture — Understand how the reasoning loop works
- Context Management — Learn how context is constructed for each agent run
- Deployment — Docker Compose and Kubernetes for production deployments