Installation
Add OpenTelemetry packages and AI instrumentation libraries to your project. The `opentelemetry-api` and `opentelemetry-sdk` packages provide the core OpenTelemetry functionality, and the `opentelemetry-exporter-otlp` package enables sending traces to Galileo's OTLP endpoint.

You can then add the relevant OpenInference packages for the framework or LLM that you are using. For example, to add the packages for LangChain and OpenAI, install the following:
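A sketch of the install commands; the OpenInference package names below are the distributions published on PyPI for LangChain and OpenAI:

```shell
# Core OpenTelemetry packages plus the OTLP exporter
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp

# OpenInference instrumentation for LangChain and OpenAI
pip install openinference-instrumentation-langchain openinference-instrumentation-openai
```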
Configure the OTel endpoint
Set up the exporter to send traces to Galileo's OTel endpoint. The OTel endpoint is different from Galileo's regular API endpoint and is specifically designed to receive telemetry data in the OTLP format. If you're using a self-hosted or custom Galileo deployment, replace `app.galileo.ai` with your deployment URL.

Set up authentication headers
Format your Galileo API key and project information for OpenTelemetry. OpenTelemetry requires headers to be set in the `OTEL_EXPORTER_OTLP_TRACES_HEADERS` environment variable in a specific comma-separated format.
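A minimal sketch of building that comma-separated value from Python. The header names used here (`Galileo-API-Key`, `project`, `logstream`) are assumptions based on typical Galileo configuration, so confirm them against your deployment's documentation:

```python
import os

# Placeholder values -- replace with your real API key, project, and log stream.
api_key = "your-galileo-api-key"
project = "my-project"
log_stream = "my-log-stream"

# OpenTelemetry expects OTLP headers as comma-separated key=value pairs.
# Header names here are assumptions; verify them for your Galileo deployment.
headers = ",".join([
    f"Galileo-API-Key={api_key}",
    f"project={project}",
    f"logstream={log_stream}",
])
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = headers
```

Setting the variable before the exporter is created lets the OTLP exporter pick the headers up automatically.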
Configure the tracer provider
Assemble the complete observability system with service metadata. This creates the pipeline that batches and exports traces to the OTel endpoint. This setup:

- Creates a `Resource` that identifies your service with metadata
- Sets up a `TracerProvider` that manages trace creation and processing
- Configures a `BatchSpanProcessor` that efficiently batches traces before sending them to Galileo
- Registers the tracer provider globally so all instrumentation can use it
Apply AI instrumentation
Now you can enable automatic tracing for your framework and LLM operations using OpenInference instrumentors. These add AI-specific semantic conventions to your traces, giving you:

- Automatic capture of LLM calls, token usage, and model performance metrics
- AI-specific span attributes like `gen_ai.request.model`, `gen_ai.response.content`, and `gen_ai.usage.*`
- Semantic conventions that make your traces more meaningful in Galileo's dashboard
- Framework-specific instrumentation for LangGraph workflows and OpenAI API calls
CrewAI and OpenTelemetry
Add Galileo logging to existing CrewAI applications using the event listener.
LangGraph and OpenAI with OpenTelemetry
Complete tutorial for instrumenting LangGraph workflows with OpenAI and OpenInference.
Mastra.ai and OpenTelemetry
Sample application for instrumenting Galileo using OpenTelemetry in a Mastra.ai agent.