Overview
This guide provides configuration instructions for connecting various Model Context Protocol clients to the Timbr MCP Server.
The Timbr MCP Server is deployed as part of the Timbr API Service. Ensure it is deployed in your environment before using it.
Authentication
You can connect to the MCP server using your Timbr token (x-api-key) or via OAuth, which must be pre-configured in your Timbr API Service.
Please follow this guide to set up OAuth for the MCP server.
Supported Clients
The following clients have been tested with the Timbr MCP Server:
- Visual Studio Code - GitHub Copilot MCP integration
- Cursor - AI-powered code editor
- Windsurf - Codeium's AI IDE
- Claude Desktop - Anthropic's desktop application
- ChatGPT - OpenAI's conversational AI
- Google AI Studio - Google's Gemini integration
- Google Antigravity - Google's MCP-enabled platform
- Microsoft Copilot Studio - Microsoft's AI agent
Server Information
| Property | Value |
|---|---|
| Server Name | timbr |
| Transport | HTTP (Streamable HTTP) |
| Base URL | https://<your-timbr-server>/timbr/api/mcp/ |
| Health Check | https://<your-timbr-server>/timbr/api/mcp/health |
Authentication Headers
All connections require an API key passed via the x-api-key header:
| Header | Description | Required |
|---|---|---|
| x-api-key | Timbr API token (prefix tk_) | Yes |
| x-agent | Target agent name | No (if default configured or ontology specified) |
| x-ontology | Target ontology name | No (if agent specified or default configured) |
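The header rules above can be summarized in a small validation sketch. The helper below is hypothetical (not part of any Timbr SDK) and only encodes the table's logic: x-api-key is mandatory, while x-agent and x-ontology are each optional as long as the other is set or the server has a default ontology configured.

```python
def validate_mcp_headers(headers, server_has_default_ontology=False):
    """Check Timbr MCP connection headers per the rules above (illustrative only)."""
    # x-api-key is always required and Timbr tokens carry the tk_ prefix.
    if not headers.get("x-api-key", "").startswith("tk_"):
        raise ValueError("x-api-key is required and must start with tk_")
    # x-agent / x-ontology are optional only if the other is provided,
    # or the server has MCP_DEFAULT_ONTOLOGY configured.
    if not (headers.get("x-agent") or headers.get("x-ontology")
            or server_has_default_ontology):
        raise ValueError("provide x-agent or x-ontology, or configure a server default")
    return True
```
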
Available Tools
AI Query Tools
These tools use an LLM to translate natural language into SQL queries.
| Tool | Description | Key Parameters |
|---|---|---|
| query_data | Execute natural language queries, return data rows | prompt, max_limit, concepts_list |
| ask_question | Get natural language answers from your data | prompt, note, show_result_set |
| generate_sql | Generate SQL from natural language without execution | prompt, db_is_case_sensitive |
| identify_concept | Find the most relevant concept for a question | prompt, include_tags |
Metadata & Exploration Tools
These tools provide direct access to knowledge graph metadata and SQL execution without requiring an LLM.
| Tool | Description | Key Parameters |
|---|---|---|
| execute_sql | Execute a raw Timbr SQL query on a knowledge graph | query, ontology, datasource, max_results |
| list_ontologies | List all available knowledge graphs | max_results |
| list_agents | List all available pre-configured agents | max_results |
| list_datasources | List datasources (all, or filtered by knowledge graph) | ontology, max_results |
| describe_knowledge_graph | Get structural metadata: concepts, properties, relationships, views, mappings, datasources, and jobs | ontology, include, max_results |
An AI agent can chain these tools together for effective data exploration:
1. list_ontologies - discover available knowledge graphs
2. describe_knowledge_graph - explore the schema (properties, relationships, views)
3. generate_sql or query_data - query the data using natural language
4. execute_sql - run or refine generated SQL directly
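The exploration chain above can be sketched as a sequence of JSON-RPC tools/call payloads. The helper below only builds the request bodies (sending them over HTTP is omitted), and the ontology name, prompt, and SQL are placeholders:

```python
def tool_call(call_id, name, arguments):
    """Build a JSON-RPC 2.0 tools/call request body for an MCP server."""
    return {
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# 1. Discover available knowledge graphs
step1 = tool_call(1, "list_ontologies", {"max_results": 50})
# 2. Explore one graph's schema
step2 = tool_call(2, "describe_knowledge_graph",
                  {"ontology": "my_ontology",
                   "include": ["properties", "relationships", "views"]})
# 3. Generate SQL from natural language
step3 = tool_call(3, "generate_sql", {"prompt": "total sales by region"})
# 4. Run the (possibly refined) SQL directly
step4 = tool_call(4, "execute_sql",
                  {"query": "SELECT * FROM some_view", "max_results": 100})
```

Each payload is what the client POSTs to the MCP base URL with the authentication headers described above.
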
describe_knowledge_graph Sections
The include parameter controls which metadata sections are returned. If omitted, the default sections are: ontology, properties, relationships, views.
| Section | Description | SQL Equivalent |
|---|---|---|
| ontology | Knowledge graph overview | SHOW ONTOLOGY |
| properties | All concept properties | SELECT * FROM timbr.SYS_PROPERTIES |
| relationships | Concept relationships | SELECT * FROM timbr.SYS_CONCEPT_RELATIONSHIPS |
| views | Available views (non-cube) | SELECT * FROM timbr.SYS_VIEWS |
| mappings | Source mappings | SELECT * FROM timbr.SYS_MAPPINGS |
| datasources | Available datasources | SHOW DATASOURCES |
| jobs | Scheduled jobs | SELECT * FROM timbr.SYS_JOBS |
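The default-section behavior of the include parameter described above reduces to a one-line fallback; a minimal sketch (the function and constant names are assumptions, not server code):

```python
# Sections returned by describe_knowledge_graph when include is omitted.
DEFAULT_SECTIONS = ["ontology", "properties", "relationships", "views"]

def resolve_include(include=None):
    """Return the metadata sections an include value selects, with the documented default."""
    return list(include) if include else list(DEFAULT_SECTIONS)
```
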
max_results Limit
All metadata and exploration tools accept an optional max_results parameter. This value is capped at the server's configured ROW_LIMIT (default: 3000). If max_results exceeds ROW_LIMIT, the server limit is used instead.
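The capping rule is simple min logic; a sketch with the default ROW_LIMIT from the text (the helper name is hypothetical):

```python
ROW_LIMIT = 3000  # server default per the text above

def effective_limit(max_results=None, row_limit=ROW_LIMIT):
    """Cap a requested max_results at the server's configured ROW_LIMIT."""
    if max_results is None:
        return row_limit
    return min(max_results, row_limit)
```
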
Common Configuration Options
Environment Variables
These can be set on the Timbr server to configure default behavior:
| Variable | Description | Default |
|---|---|---|
| MCP_DEFAULT_ONTOLOGY | Default ontology when not specified | None |
| MCP_CORS_ORIGINS | Allowed CORS origins | * |
| MCP_DEBUG | Enable debug logging | false |
| ROW_LIMIT | Maximum rows returned by metadata/exploration tools | 3000 |
Tool-Specific Parameters
When calling AI query tools, you can pass these optional parameters:
| Parameter | Applicable Tools | Description |
|---|---|---|
| ontology | All AI query tools | Override default ontology |
| verbose | All AI query tools | Return full metadata (default: false) |
| concepts_list | All AI query tools | Limit to specific concepts |
| views_list | query_data, ask_question, generate_sql | Limit to specific views |
| include_tags | All except generate_sql | Filter by tags |
| max_limit | query_data, ask_question | Max rows to return (default: 500) |
| note | ask_question | Additional context for the LLM |
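As an illustration of how a client might assemble these optional parameters for a query_data call, only sending the ones actually set (the helper is hypothetical, not part of any Timbr SDK):

```python
def build_query_data_args(prompt, max_limit=500, ontology=None,
                          concepts_list=None, verbose=False):
    """Assemble an arguments dict for the query_data tool, omitting unset options."""
    args = {"prompt": prompt, "max_limit": max_limit, "verbose": verbose}
    if ontology:
        args["ontology"] = ontology       # override the server/header default
    if concepts_list:
        args["concepts_list"] = concepts_list  # restrict the LLM to these concepts
    return args
```
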
General Configuration
For any MCP-compatible client, use the following configuration template:
HTTP Transport (Recommended)
{
"mcpServers": {
"timbr": {
"transport": "http",
"url": "https://<your-timbr-server>/timbr/api/mcp/",
"headers": {
"x-api-key": "<your-api-key>",
"x-agent": "<agent-name>"
}
}
}
}
SSE Transport (Alternative)
This example uses the x-ontology header, which can be used instead of x-agent.
{
"mcpServers": {
"timbr": {
"transport": "sse",
"url": "https://<your-timbr-server>/timbr/api/mcp/sse",
"headers": {
"x-api-key": "<your-api-key>",
"x-ontology": "<your-ontology>"
}
}
}
}
Testing with curl
Verify your connection:
# Health check (no auth required)
curl https://<your-timbr-server>/timbr/api/mcp/health
# Initialize connection
curl -X POST https://<your-timbr-server>/timbr/api/mcp/ \
-H "Content-Type: application/json" \
-H "x-api-key: tk_your_api_key" \
-d '{
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {
"protocolVersion": "2024-11-05",
"capabilities": {}
}
}'
# List available tools
curl -X POST https://<your-timbr-server>/timbr/api/mcp/ \
-H "Content-Type: application/json" \
-H "x-api-key: tk_your_api_key" \
-d '{
"jsonrpc": "2.0",
"id": 2,
"method": "tools/list"
}'
# Execute a query
curl -X POST https://<your-timbr-server>/timbr/api/mcp/ \
-H "Content-Type: application/json" \
-H "x-api-key: tk_your_api_key" \
-H "x-ontology: my_ontology" \
-d '{
"jsonrpc": "2.0",
"id": 3,
"method": "tools/call",
"params": {
"name": "query_data",
"arguments": {
"prompt": "Show all customers",
"max_limit": 10
}
}
}'
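The same calls can be issued from Python's standard library. The sketch below builds a urllib request equivalent to the curl examples above; the server URL, API key, and ontology are placeholders, and sending requires a reachable server:

```python
import json
import urllib.request

def timbr_mcp_request(url, headers, method, params=None, call_id=1):
    """Build a POST urllib Request carrying a JSON-RPC 2.0 body to the MCP endpoint."""
    body = {"jsonrpc": "2.0", "id": call_id, "method": method}
    if params is not None:
        body["params"] = params
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json", **headers},
    )

# Sending (placeholders as in the curl examples; will not resolve as-is):
# req = timbr_mcp_request(
#     "https://<your-timbr-server>/timbr/api/mcp/",
#     {"x-api-key": "tk_your_api_key", "x-ontology": "my_ontology"},
#     "tools/call",
#     {"name": "query_data",
#      "arguments": {"prompt": "Show all customers", "max_limit": 10}})
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```
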
Client-Specific Notes
Microsoft Copilot Studio
Copilot Studio has specific requirements and limitations when connecting to MCP servers:
- Streamable HTTP only - SSE transport is not supported (deprecated after August 2025)
- OAuth with DCR - requires MCP_OAUTH_DCR_ENABLED to be true (it is by default), with MCP_OAUTH_CLIENT_ID and MCP_OAUTH_CLIENT_SECRET configured on the server
- Tool inputs - Copilot Studio does not expose individual MCP tool inputSchema parameters in the UI. To set defaults like x-agent or x-ontology, edit the connector in Power Apps after creation
- Schema limitations - tools with $ref type inputs are filtered out; exclusiveMinimum must be a boolean, not an integer
For detailed setup instructions, see the Copilot Studio section in the OAuth guide.
Troubleshooting
Connection Issues
| Issue | Solution |
|---|---|
| Connection refused | Verify server URL and that Timbr API is running |
| 401 Unauthorized | Check API key is correct and starts with tk_ |
| 403 Forbidden | Verify API key has access to the specified ontology |
| Ontology not found | Set x-ontology header or MCP_DEFAULT_ONTOLOGY on server |
Common Errors
| Error | Cause | Solution |
|---|---|---|
| "LLM type is required" | Server LLM not configured | Configure LLM settings on Timbr server |
| "Invalid JSON-RPC" | Malformed request | Check request body format |