MCP Server: Langfuse
A comprehensive MCP server for Langfuse observability: access traces, scores, and datasets, and manage prompts, all through an LLM interface.
The Problem
Debugging LLM applications requires constantly switching between code and the Langfuse dashboard. Accessing traces, scores, and datasets programmatically is cumbersome.
The Solution
A comprehensive MCP server for Langfuse observability. Access traces, manage datasets, query scores, and handle prompt versioning—all through your LLM interface.
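Under the hood, tools like these wrap Langfuse's public REST API, which authenticates with HTTP Basic auth (public key as username, secret key as password). A minimal sketch of building such a request, assuming the Langfuse Cloud host and the `/api/public/traces` endpoint; exact query parameters may differ from what the server's tools expose:

```typescript
// Sketch: building an authenticated request to Langfuse's public API.
// The host below assumes Langfuse Cloud; self-hosted deployments use their own URL.

// Langfuse authenticates API calls with HTTP Basic auth:
// username = public key (pk-lf-...), password = secret key (sk-lf-...).
function authHeader(publicKey: string, secretKey: string): string {
  const credentials = Buffer.from(`${publicKey}:${secretKey}`).toString("base64");
  return `Basic ${credentials}`;
}

// Build a paginated traces query URL.
function tracesUrl(host: string, page = 1, limit = 10): string {
  const params = new URLSearchParams({ page: String(page), limit: String(limit) });
  return `${host}/api/public/traces?${params.toString()}`;
}

// Usage (network call commented out in this sketch):
// const res = await fetch(tracesUrl("https://cloud.langfuse.com"), {
//   headers: { Authorization: authHeader(publicKey, secretKey) },
// });
// const { data } = await res.json(); // array of trace summaries
```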
Architecture
```mermaid
%%{init: {'theme': 'dark', 'themeVariables': { 'fontFamily': 'Inter', 'secondaryColor': '#1e293b', 'primaryColor': '#3b82f6', 'primaryBorderColor': '#60a5fa' }}}%%
graph LR
    subgraph Client ["Client"]
        A["LLM<br/>(Claude/GPT)"]
    end
    subgraph Protocol ["MCP Protocol"]
        A <-->|"JSON-RPC"| B["MCP Server"]
    end
    subgraph Backend ["Backend"]
        B <--> C["API"]
        B --> D["Tools"]
    end
    classDef default fill:#0f172a,stroke:#334155,color:#fff,stroke-width:1px;
    classDef agent fill:#0f172a,stroke:#3b82f6,color:#fff;
    classDef process fill:#0f172a,stroke:#334155,color:#fff;
    class A agent;
    class B,C,D process;
```
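The JSON-RPC edge in the diagram carries MCP's standard methods: the client discovers tools with `tools/list` and invokes one with `tools/call`. A sketch of such an exchange, where the tool name and arguments are illustrative assumptions rather than this server's actual schema:

```typescript
// Illustrative MCP tool invocation over JSON-RPC 2.0.
// "get-traces" and its arguments are hypothetical names for this sketch.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "get-traces",
    arguments: { limit: 10 },
  },
};

// The server replies with a result object keyed to the same id;
// MCP tool results carry a content array of typed parts.
const response = {
  jsonrpc: "2.0" as const,
  id: 1,
  result: {
    content: [{ type: "text", text: "[trace summaries as JSON]" }],
  },
};
```

Because both sides speak this one protocol, the LLM client needs no Langfuse-specific knowledge; the server translates each `tools/call` into the corresponding backend API request.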
Tags
TypeScript, Langfuse, Observability
Outcomes
- 22 tools for complete observability access
- Trace, score, dataset, and session management
- Works with Langfuse Cloud and self-hosted
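For an MCP client such as Claude Desktop, the server would typically be registered in `claude_desktop_config.json` under `mcpServers`. The package name and environment variable names below are assumptions for illustration; substitute the project's actual values, and point `LANGFUSE_BASEURL` at your own host when self-hosting:

```json
{
  "mcpServers": {
    "langfuse": {
      "command": "npx",
      "args": ["-y", "mcp-server-langfuse"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-lf-your-key",
        "LANGFUSE_SECRET_KEY": "sk-lf-your-key",
        "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
      }
    }
  }
}
```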