Installation

To integrate OpenTelemetry (OTEL) tracing with Galileo, you need the following dependencies:

pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp opentelemetry-instrumentation

You will also need the OpenInference instrumentor for your framework, e.g. for LangChain:

pip install openinference-instrumentation-langchain


Authentication

  1. Generate an API key from your Galileo account.
  2. Copy your project name and log stream name from the Galileo console.

Set the authentication headers in your environment:

import os

headers = {
    "Galileo-API-Key": "<YOUR_API_KEY>",            # Replace with your actual API key
    "project": "<YOUR_PROJECT_NAME>",               # Replace with your project name
    "logstream": "<YOUR_LOG_STREAM_NAME>",          # Must match the log stream name in Galileo
}

os.environ['OTEL_EXPORTER_OTLP_TRACES_HEADERS'] = ",".join([f"{k}={v}" for k, v in headers.items()])
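The resulting environment variable is a comma-separated list of key=value pairs. A quick local sanity check (using placeholder values, not real credentials):

```python
import os

headers = {
    "Galileo-API-Key": "test-key",
    "project": "my-project",
    "logstream": "my-log-stream",
}
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = ",".join(
    f"{k}={v}" for k, v in headers.items()
)

# The OTLP exporter expects comma-separated key=value pairs:
print(os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"])
# Galileo-API-Key=test-key,project=my-project,logstream=my-log-stream
```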

Once the headers are set, you can initialize OTEL tracing.

Python OpenTelemetry setup

Finally, set up OTEL tracing to send traces to Galileo, with the exporter endpoint pointed at your Galileo API. Place this setup at the top of your script, before initializing your LLM or pipeline framework:

from opentelemetry import trace as trace_api
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.langchain import LangChainInstrumentor

# OTEL exporter endpoint
endpoint = "https://app.galileo.ai/api/galileo/otel/traces"

# Configure OTEL tracer provider
tracer_provider = trace_sdk.TracerProvider()

# Sends traces to Galileo
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint)))

# Optional: Prints traces to console for local debugging
tracer_provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))

trace_api.set_tracer_provider(tracer_provider=tracer_provider)

# Instrument LangChain (or replace with your framework's instrumentor)
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)

Note: Apply instrumentation before importing or initializing your framework (e.g., LangChain, OpenAI).


Full Working Example

Check out a complete working example on Colab: πŸ”— OpenTelemetry + Galileo Integration Notebook


Summary

To enable OTEL logging with Galileo:

  1. Set the OTEL headers using your API key, project name, and log stream name.
  2. Set the exporter endpoint: https://app.galileo.ai/api/galileo/otel/traces
  3. Register your tracer provider and processors.
  4. Import and apply the OpenInference instrumentor(s) before initializing your application.

Supported Instrumentors

These can be used individually or in combination, depending on your framework. See the full import list in the FAQ below.


FAQ

How do I create a project?

Log in to the Galileo console and create a new logging project. After creation, copy the project name and log stream name from the interface.


Which instrumentors should I import?

Choose the instrumentor(s) matching your framework:

# OpenAI
from openinference.instrumentation.openai import OpenAIInstrumentor

# LlamaIndex
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# DSPy
from openinference.instrumentation.dspy import DSPyInstrumentor

# Bedrock
from openinference.instrumentation.bedrock import BedrockInstrumentor

# LangChain
from openinference.instrumentation.langchain import LangChainInstrumentor

# Mistral AI
from openinference.instrumentation.mistralai import MistralAIInstrumentor

# Guardrails
from openinference.instrumentation.guardrails import GuardrailsInstrumentor

# Vertex AI
from openinference.instrumentation.vertexai import VertexAIInstrumentor

# CrewAI
from openinference.instrumentation.crewai import CrewAIInstrumentor
from openinference.instrumentation.langchain import LangChainInstrumentor

# Haystack
from openinference.instrumentation.haystack import HaystackInstrumentor

# LiteLLM
from openinference.instrumentation.litellm import LiteLLMInstrumentor

# Groq
from openinference.instrumentation.groq import GroqInstrumentor

# Instructor
from openinference.instrumentation.instructor import InstructorInstrumentor

# Anthropic
from openinference.instrumentation.anthropic import AnthropicInstrumentor

Instructor Example with Client:

from openinference.instrumentation.instructor import InstructorInstrumentor
from openinference.instrumentation.openai import OpenAIInstrumentor
from openai import OpenAI
import instructor

# Apply instrumentation before creating the client
InstructorInstrumentor().instrument()
OpenAIInstrumentor().instrument()

client = instructor.from_openai(OpenAI())

What is the correct OTEL endpoint?

Use the Galileo API base URL with the /otel/traces path:

https://app.galileo.ai/api/galileo/otel/traces
or, for a custom cluster:
http://api.YOUR_CLUSTER_URL/otel/traces

If your console URL is https://console.your_cluster_url.com, the endpoint becomes http://api.your_cluster_url.com/otel/traces.
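As an illustration only (this helper is hypothetical, not part of any SDK), the console-URL-to-endpoint mapping described above can be sketched as:

```python
from urllib.parse import urlparse

def otel_endpoint_from_console(console_url: str) -> str:
    """Illustrative helper: swap the 'console.' host prefix for 'api.'
    and append the /otel/traces path (scheme as documented above)."""
    host = urlparse(console_url).netloc
    api_host = host.replace("console.", "api.", 1)
    return f"http://{api_host}/otel/traces"

print(otel_endpoint_from_console("https://console.your_cluster_url.com"))
# http://api.your_cluster_url.com/otel/traces
```

Verify the exact scheme (http vs https) with your cluster administrator.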


How do I verify it’s working?

  • Use ConsoleSpanExporter to see local trace output.
  • Check the Galileo dashboard for incoming traces.
  • Add a test span like this:
from opentelemetry import trace

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("test-span"):
    print("Tracing is active")

Troubleshooting

  • βœ… Confirm your API key, project name, and log stream name are correct.
  • βœ… Make sure the environment variable is correctly formatted.
  • βœ… Ensure the endpoint is accessible from your environment.
  • βœ… Instrument your framework before using it.
  • βœ… Restart environment after changing environment variables.
  • βœ… Use print(os.environ['OTEL_EXPORTER_OTLP_TRACES_HEADERS']) to debug your header string.

Resources

Notebooks

πŸ”— Anthropic + Galileo + OTEL Notebook
πŸ”— CrewAI + Galileo + OTEL Notebook
πŸ”— DSPy + Galileo + OTEL Notebook
πŸ”— Groq + Galileo + OTEL Notebook
πŸ”— LlamaIndex + Galileo + OTEL Notebook
πŸ”— LangChain + Galileo + OTEL Notebook
πŸ”— LiteLLM + Galileo + OTEL Notebook
πŸ”— Mistral + Galileo + OTEL Notebook
πŸ”— OpenAI + Galileo + OTEL Notebook
πŸ”— Vertex AI + Galileo + OTEL Notebook