OpenTelemetry
- Hobby: Public Beta
- Pro: Public Beta
- Team: Public Beta
- Self Hosted: Public Beta
OpenTelemetry support in Langfuse is experimental. All APIs may change at any point in time without prior notice. On Langfuse Cloud, we have strict rate-limits in place for OpenTelemetry span ingestion.
We share this feature to gather feedback and improve it based on our users' needs. Please share any feedback you have in the OpenTelemetry Support GitHub Discussion.
OpenTelemetry is a CNCF project that provides a set of specifications, APIs, and libraries defining a standard way to collect distributed traces and metrics from your application. OpenTelemetry maintains an experimental set of Semantic Conventions for GenAI attributes on traces. In addition to our native SDKs and our vendor-specific integrations, we have added experimental support for OpenTelemetry.
Getting Started
To get started using the OpenTelemetry integration, you will need the Langfuse Trace API endpoint and your Langfuse API keys.
For our EU data region the endpoint is https://cloud.langfuse.com/api/public/otel; for the US data region it is https://us.cloud.langfuse.com/api/public/otel. We will use the EU data region in the following examples.
In addition, you will need your Langfuse API keys, e.g. pk-lf-1234567890 and sk-lf-1234567890. Run

```shell
echo -n ${LANGFUSE_PUBLIC_KEY}:${LANGFUSE_SECRET_KEY} | base64
```

to get the base64-encoded API keys (referred to as "AUTH_STRING" going forward).
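The same AUTH_STRING can also be produced programmatically. A minimal Python sketch, using the placeholder keys from above (not real credentials):

```python
import base64

# Placeholder keys from the example above -- substitute your own.
public_key = "pk-lf-1234567890"
secret_key = "sk-lf-1234567890"

# HTTP Basic auth expects base64("<public_key>:<secret_key>") with no trailing
# newline, which is why the shell command uses `echo -n`.
auth_string = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
print(auth_string)
```

The resulting string is what goes after "Basic " in the Authorization header.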
Using those parameters, you can configure the OpenTelemetry exporters for your tracing framework as follows:
Add a Langfuse exporter to your OpenTelemetry Collector configuration:
```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch:
  memory_limiter:
    # 80% of maximum memory up to 2G
    limit_mib: 1500
    # 25% of limit up to 2G
    spike_limit_mib: 512
    check_interval: 5s

exporters:
  otlp/langfuse:
    endpoint: "cloud.langfuse.com/api/public/otel"
    headers:
      Authorization: "Basic ${AUTH_STRING}"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [otlp/langfuse]
```
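If you instrument an application directly instead of routing through a Collector, most OpenTelemetry SDKs can be pointed at Langfuse via the standard OTLP exporter environment variables. A sketch, assuming the EU endpoint and the AUTH_STRING from above; note that some SDKs require the space in the header value to be percent-encoded (i.e. Basic%20 instead of Basic followed by a space):

```shell
# Standard OTLP exporter configuration via OpenTelemetry environment variables.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://cloud.langfuse.com/api/public/otel"
# AUTH_STRING is the base64-encoded "public_key:secret_key" from above.
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Basic ${AUTH_STRING}"
```

Consult your SDK's documentation for the exact header-encoding rules it applies.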
Property Mapping
Langfuse accepts any span that adheres to the OpenTelemetry specification.
In addition, we map many GenAI-specific properties to properties in the Langfuse data model to provide a seamless experience when using OpenTelemetry with Langfuse.
First and foremost, we stick to the OpenTelemetry Gen AI Conventions, but we also map vendor-specific properties from common frameworks.
All attributes and resourceAttributes are available within the Langfuse metadata property as a fallback.
Below, we share a non-exhaustive list of mappings that Langfuse applies:
| OpenTelemetry Attribute | Langfuse Property | Description |
| --- | --- | --- |
| `gen_ai.usage.cost` | `costDetails.total` | The total cost of the request. |
| `gen_ai.usage.*` | `usageDetails.*` | Maps all keys within usage, except cost, to `usageDetails`. Token properties are simplified to `input`, `output`, and `total`. |
| `gen_ai.request.model` | `model` | The model used for the request. |
| `gen_ai.response.model` | `model` | The model used for the response. |
| `gen_ai.request.*` | `modelParameters` | Maps all keys within request to `modelParameters`. |
| `langfuse.session.id` | `sessionId` | The session ID for the request. |
| `session.id` | `sessionId` | The session ID for the request. |
| `langfuse.user.id` | `userId` | The user ID for the request. |
| `user.id` | `userId` | The user ID for the request. |
| `gen_ai.prompt` | `input` | Input field. Deprecated by OpenTelemetry, as event properties should be preferred. |
| `gen_ai.completion` | `output` | Output field. Deprecated by OpenTelemetry, as event properties should be preferred. |
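To make the mappings above concrete, here is a hedged Python sketch of the kind of attribute translation Langfuse performs server-side. `map_otel_attributes` is a hypothetical illustration written for this table, not actual Langfuse code, and it omits details such as the simplification of token keys to `input`/`output`/`total`:

```python
def map_otel_attributes(attributes: dict) -> dict:
    """Hypothetical sketch of the OTel-attribute -> Langfuse-property mapping."""
    # All attributes are always kept in metadata as a fallback.
    observation = {"metadata": dict(attributes)}

    model_parameters = {}
    usage_details = {}
    for key, value in attributes.items():
        if key in ("langfuse.session.id", "session.id"):
            observation["sessionId"] = value
        elif key in ("langfuse.user.id", "user.id"):
            observation["userId"] = value
        elif key in ("gen_ai.request.model", "gen_ai.response.model"):
            observation["model"] = value
        elif key == "gen_ai.usage.cost":
            observation.setdefault("costDetails", {})["total"] = value
        elif key.startswith("gen_ai.usage."):
            usage_details[key.removeprefix("gen_ai.usage.")] = value
        elif key.startswith("gen_ai.request."):
            model_parameters[key.removeprefix("gen_ai.request.")] = value
    if usage_details:
        observation["usageDetails"] = usage_details
    if model_parameters:
        observation["modelParameters"] = model_parameters
    return observation

mapped = map_otel_attributes({
    "gen_ai.request.model": "gpt-4o",
    "gen_ai.request.temperature": 0.2,
    "gen_ai.usage.input_tokens": 12,
    "session.id": "session-1",
})
```

Note that `gen_ai.request.model` is matched before the generic `gen_ai.request.*` branch, so it sets `model` rather than landing in `modelParameters`.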