Galileo supports logging traces from Microsoft Agent Framework applications using OpenTelemetry. The framework has built-in OTel instrumentation, so no extra instrumentation package is needed.

Set up OpenTelemetry

To log Microsoft Agent Framework traces using Galileo, the first step is to set up OpenTelemetry.
1. Installation

Add the OpenTelemetry packages to your project:
pip install opentelemetry-api opentelemetry-sdk \
            opentelemetry-exporter-otlp
The opentelemetry-api and opentelemetry-sdk packages provide the core OpenTelemetry functionality. The opentelemetry-exporter-otlp package enables sending traces to Galileo’s OTLP endpoint.
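If you want to confirm the packages are present before wiring anything up, a small sketch using only the standard library's importlib.metadata can report what is installed (the installed_versions helper is illustrative, not part of any SDK):

```python
from importlib.metadata import PackageNotFoundError, version

OTEL_PACKAGES = [
    "opentelemetry-api",
    "opentelemetry-sdk",
    "opentelemetry-exporter-otlp",
]

def installed_versions(packages):
    """Map each package name to its installed version, or None if missing."""
    result = {}
    for pkg in packages:
        try:
            result[pkg] = version(pkg)
        except PackageNotFoundError:
            result[pkg] = None
    return result

print(installed_versions(OTEL_PACKAGES))
```

Any None value in the output means the corresponding package still needs to be installed.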
2. Create environment variables for your Galileo settings

Set environment variables for your Galileo settings, for example in a .env file. These environment variables are consumed by the GalileoSpanProcessor to authenticate and route traces to the correct Galileo project and Log stream:
# Your Galileo API key
GALILEO_API_KEY="your-galileo-api-key"

# Your Galileo project name
GALILEO_PROJECT="your-galileo-project-name"

# The name of the Log stream you want to use for logging
GALILEO_LOG_STREAM="your-galileo-log-stream"

# Provide the console URL below if you are using a custom deployment
# rather than the free tier at app.galileo.ai. This will look something
# like "console.galileo.yourcompany.com".
# GALILEO_CONSOLE_URL="your-galileo-console-url"
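Since these variables are read at startup, it can help to fail fast when one is missing. A minimal sketch using only the standard library, assuming the variable names from the .env file above (the check_galileo_env helper is hypothetical):

```python
import os

# Variable names match the .env file above.
REQUIRED_VARS = ["GALILEO_API_KEY", "GALILEO_PROJECT", "GALILEO_LOG_STREAM"]

def check_galileo_env():
    """Return the names of required Galileo variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not os.environ.get(name)]

missing = check_galileo_env()
print("Missing Galileo settings:", missing or "none")
```

Calling this before initializing tracing gives a clearer error than an authentication failure deep inside the exporter.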
3. Self-hosted deployments: set the OTel endpoint

Skip this step if you are using Galileo Cloud.
The OTel endpoint is different from Galileo's regular API endpoint and is specifically designed to receive telemetry data in the OTLP format. If you are using:
  • Galileo Cloud at app.galileo.ai, then you don’t need to provide a custom OTel endpoint. The default endpoint https://api.galileo.ai/otel/traces will be used automatically.
  • A self-hosted Galileo deployment, replace the https://api.galileo.ai/otel/traces endpoint with your deployment URL. The format of this URL is based on your console URL, replacing console with api and appending /otel/traces.
For example:
  • if your console URL is https://console.galileo.example.com, the OTel endpoint would be https://api.galileo.example.com/otel/traces
  • if your console URL is https://console-galileo.apps.mycompany.com, the OTel endpoint would be https://api-galileo.apps.mycompany.com/otel/traces
The convention is to store this in the OTEL_EXPORTER_OTLP_TRACES_ENDPOINT environment variable. For example:
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = \
    "https://api.galileo.ai/otel/traces"
4. Initialize and create the Galileo span processor

The GalileoSpanProcessor automatically configures authentication and metadata using your environment variables. It also:
  • Auto-builds OTLP headers using your Galileo credentials
  • Configures the correct OTLP trace endpoint
  • Registers a batch span processor that exports traces to Galileo
from galileo import otel  

# GalileoSpanProcessor (no manual OTLP config required) loads the env vars for 
# the Galileo API key, Project, and Log stream. Make sure to set them first. 
galileo_span_processor = otel.GalileoSpanProcessor(
    # Optional parameters; if not set, the corresponding env vars are used:
    # project=os.environ["GALILEO_PROJECT"],
    # logstream=os.environ.get("GALILEO_LOG_STREAM"),
)

Log a Microsoft Agent Framework agent using OpenTelemetry

Once OpenTelemetry is configured, you can use the GalileoSpanProcessor to capture traces from your agent.
1. Create the tracer provider with the Galileo span processor

Set up a TracerProvider with the GalileoSpanProcessor and register it as the global tracer provider:
from galileo.otel import GalileoSpanProcessor, add_galileo_span_processor
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider

tracer_provider = TracerProvider()
galileo_processor = GalileoSpanProcessor()
add_galileo_span_processor(tracer_provider, galileo_processor)
trace.set_tracer_provider(tracer_provider)
2. Enable the framework's instrumentation

Enable the Microsoft Agent Framework’s built-in OTel instrumentation. Set enable_sensitive_data=True to send LLM inputs and outputs to Galileo. If set to False, only span metadata (timing, token counts, etc.) will be sent.
from agent_framework.observability import enable_instrumentation

enable_instrumentation(enable_sensitive_data=True)
When you run your agent code, traces will be logged to Galileo.
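Since sensitive-data capture is often something you want on in development but off in production, one pattern is to drive the flag from an environment variable. A sketch where both the variable name GALILEO_LOG_SENSITIVE_DATA and the helper are hypothetical:

```python
import os

def sensitive_data_enabled(default: bool = False) -> bool:
    """Read the hypothetical GALILEO_LOG_SENSITIVE_DATA env var as a boolean."""
    value = os.environ.get("GALILEO_LOG_SENSITIVE_DATA")
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "yes")

# Then pass the result to the framework's instrumentation call:
# enable_instrumentation(enable_sensitive_data=sensitive_data_enabled())
```

This lets a production deployment disable prompt and response logging without a code change.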

Full example

Here is a full example based on the Microsoft Agent Framework’s tool calling sample. You can find this project in the Galileo SDK examples repo. To run this example, create a .env file with the following values set, or set them as environment variables:
.env
# OpenAI environment variables
OPENAI_API_KEY=your-openai-api-key

# Galileo environment variables
GALILEO_API_KEY=your-galileo-api-key
GALILEO_PROJECT=your-galileo-project
GALILEO_LOG_STREAM=your-log-stream
Remember to update these to match your Galileo API key, project name, and Log stream name.
from random import randint
from typing import Annotated

from agent_framework import openai, tool
from agent_framework.observability import enable_instrumentation
from galileo.otel import GalileoSpanProcessor, add_galileo_span_processor
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from pydantic import Field

# Set up the OTel tracer provider with the Galileo span processor
tracer_provider = TracerProvider()
galileo_processor = GalileoSpanProcessor()
add_galileo_span_processor(tracer_provider, galileo_processor)
trace.set_tracer_provider(tracer_provider)

# Enable the Microsoft Agent Framework's built-in OTel instrumentation.
# Set enable_sensitive_data=True to send LLM inputs and outputs to Galileo.
# If set to False, only span metadata (timing, token counts, etc.) will be sent.
enable_instrumentation(enable_sensitive_data=True)


@tool(approval_mode="never_require")
def get_weather(
    location: Annotated[
        str, Field(description="The location to get the weather for.")
    ],
) -> str:
    """Get the weather for a given location."""
    conditions = ["sunny", "cloudy", "rainy", "stormy"]
    temp = randint(10, 30)
    condition = conditions[randint(0, 3)]
    return f"The weather in {location} is {condition}, {temp}C."


client = openai.OpenAIChatClient(model_id="gpt-4.1-mini")

agent = client.as_agent(
    name="WeatherAgent",
    instructions="You are a helpful weather agent. "
    "Use the get_weather tool to answer questions.",
    tools=[get_weather],
)


async def main():
    result = await agent.run("What's the weather like in Seattle?")
    print(result)


if __name__ == "__main__":
    import asyncio

    asyncio.run(main())