Logging using OpenTelemetry to Galileo with OpenInference in Python
Log to Galileo's Observe platform via OpenTelemetry: simply point the endpoint and headers at Galileo to be up and running.
Installation
To integrate OpenTelemetry (OTEL) tracing with Galileo, you need the following dependencies:
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp opentelemetry-instrumentation
Authentication
First, create an API key to authenticate. Then copy your project name and log stream name.
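The header setup can be sketched as follows. The header names (`Galileo-API-Key`, `project`, `logstream`) and the placeholder values are assumptions — check your Galileo console for the exact keys your deployment expects:

```python
import os

# Hypothetical header names and values -- replace with your own API key,
# project name, and log stream name copied from the console.
headers = {
    "Galileo-API-Key": os.environ.get("GALILEO_API_KEY", "your-api-key"),
    "project": "my-project",       # project name from the console
    "logstream": "my-log-stream",  # log stream name from the console
}

# The OTLP exporter can also read headers from an environment variable,
# formatted as comma-separated key=value pairs:
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = ",".join(
    f"{key}={value}" for key, value in headers.items()
)
```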
Once the headers are set, we can initialize the OTEL logging.
Python setup
Finally, we can set up the OTEL logging to send traces to Galileo. For this we need to set the endpoint and the headers.
Summary
- Set the headers with the API key, project name, and log stream name.
- Initialize the OTEL logging with the correct endpoint.
- Enable the instrumentors for the frameworks you use.
- Start logging with OTEL.
Supported instrumentors are:
- OpenAI: openinference-instrumentation-openai
- LlamaIndex: openinference-instrumentation-llama-index
- DSPy: openinference-instrumentation-dspy
- Bedrock: openinference-instrumentation-bedrock
- LangChain: openinference-instrumentation-langchain
- Mistral: openinference-instrumentation-mistralai
- Guardrails: openinference-instrumentation-guardrails
- Vertex AI: openinference-instrumentation-vertexai
- CrewAI: openinference-instrumentation-crewai
- Haystack: openinference-instrumentation-haystack
- LiteLLM: openinference-instrumentation-litellm
- Groq: openinference-instrumentation-groq
- Instructor: openinference-instrumentation-instructor
- Anthropic: openinference-instrumentation-anthropic
FAQ
How do I set up a project?
Log into your console and create a new logging project. Copy the project name and log stream name after setup.
Which instrumentors do I need to import?
For these frameworks you need to import the following instrumentors:
OpenAI:
from openinference.instrumentation.openai import OpenAIInstrumentor
LlamaIndex:
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
DSPy:
from openinference.instrumentation.dspy import DSPyInstrumentor
Bedrock:
from openinference.instrumentation.bedrock import BedrockInstrumentor
LangChain:
from openinference.instrumentation.langchain import LangChainInstrumentor
Mistral:
from openinference.instrumentation.mistralai import MistralAIInstrumentor
Guardrails:
from openinference.instrumentation.guardrails import GuardrailsInstrumentor
Vertex AI:
from openinference.instrumentation.vertexai import VertexAIInstrumentor
CrewAI:
from openinference.instrumentation.crewai import CrewAIInstrumentor
from openinference.instrumentation.langchain import LangChainInstrumentor
Haystack:
from openinference.instrumentation.haystack import HaystackInstrumentor
LiteLLM:
from openinference.instrumentation.litellm import LiteLLMInstrumentor
Groq:
from openinference.instrumentation.groq import GroqInstrumentor
Instructor:
from openinference.instrumentation.instructor import InstructorInstrumentor
Anthropic:
from openinference.instrumentation.anthropic import AnthropicInstrumentor
How do I get the correct endpoint?
The endpoint is your console URL with "console" replaced by "api", followed by the path /otel/traces.
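As a sketch, the transformation can be expressed as a small helper; the console URL below is a hypothetical example, not a real deployment:

```python
def galileo_otel_endpoint(console_url: str) -> str:
    """Derive the OTEL traces endpoint from a Galileo console URL.

    Replaces the first 'console' host label with 'api' and appends /otel/traces.
    """
    return console_url.replace("console", "api", 1).rstrip("/") + "/otel/traces"

# Hypothetical console URL for illustration:
print(galileo_otel_endpoint("https://console.galileo.example.com"))
# -> https://api.galileo.example.com/otel/traces
```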