Galileo provides wrappers for popular LLM providers and frameworks to make it easy to integrate logging into your existing applications. These wrappers automatically capture LLM interactions and log them to Galileo.
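The real wrappers live in the `galileo` package, but the general pattern they rely on — intercepting method calls on a client and recording them — can be sketched with a JavaScript `Proxy`. Everything below (`wrapWithLogging`, `Logger`) is an illustrative name, not Galileo's actual implementation:

```typescript
// Illustrative sketch only: shows the proxy-wrapping idea a logging wrapper
// can use. The galileo package's wrapOpenAI is more sophisticated than this.
type Logger = (method: string, args: unknown[], result: unknown) => void;

function wrapWithLogging<T extends object>(client: T, log: Logger): T {
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      // Pass non-function properties through untouched
      if (typeof value !== "function") return value;
      // Wrap methods so every call is recorded before its result is returned
      return (...args: unknown[]) => {
        const result = value.apply(target, args);
        log(String(prop), args, result);
        return result;
      };
    },
  });
}
```

Because the proxy forwards every call to the underlying client, the wrapped object behaves exactly like the original — which is why you can use a wrapped client "as you normally would."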

Available Wrappers

  • OpenAI: Wrap the OpenAI client to automatically log all LLM interactions
  • Anthropic: Coming soon

Usage

Import the wrapper for your LLM provider and wrap your client instance:

import { OpenAI } from "openai";
import { wrapOpenAI } from "galileo";

const openai = wrapOpenAI(new OpenAI({ apiKey: process.env.OPENAI_API_KEY }));

// Use the wrapped client as you normally would
async function callOpenAI() {
  const response = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ content: "Say hello world!", role: "user" }],
  });
  
  return response;
}

// Call the function
callOpenAI(); 
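The wrapper does not change the response shape, so you read the reply exactly as with the plain OpenAI SDK. A minimal sketch, assuming a helper name of our own (`replyText`) and a `ChatResponse` type trimmed to just the fields used here (the full type lives in the `openai` package):

```typescript
// Trimmed slice of the chat completion response; the wrapped client returns
// the same structure as the unwrapped one.
interface ChatResponse {
  choices: { message: { role: string; content: string | null } }[];
}

// Pull the assistant's text out of a completion, defaulting to "" when there
// is no text content (for example, an empty choices array).
function replyText(response: ChatResponse): string {
  return response.choices[0]?.message.content ?? "";
}
```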

Example with Initialization and Flushing

import { OpenAI } from "openai";
import { wrapOpenAI, flush, init } from "galileo";

// Initialize Galileo
init({
  projectName: "my-project",
  logStreamName: "development"
});

const openai = wrapOpenAI(new OpenAI({ apiKey: process.env.OPENAI_API_KEY }));

async function callOpenAI() {
  const response = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ content: "Say hello world!", role: "user" }],
  });
  
  // Upload any buffered logs to Galileo
  await flush();
  
  return response;
}

// Call the function
callOpenAI();