Module
Galileo wrapper for OpenAI that automatically logs prompts and responses. This module provides a drop-in replacement for the OpenAI library that logs all prompts, responses, and related metadata to Galileo. It works by intercepting calls to the OpenAI API and recording them with the Galileo logging system. Note that the original OpenAI package is still required as a project dependency to use this wrapper.

ResponseGeneratorSync

A wrapper for OpenAI streaming responses that logs them to Galileo. This class wraps the OpenAI streaming response generator and logs the assembled response once the generator is exhausted. It implements the iterator protocol, so streamed responses can be consumed as usual.

Arguments
- resource (OpenAiModuleDefinition): The OpenAI resource definition.
- response (Generator or openai.Stream): The OpenAI streaming response.
- input_data (OpenAiInputData): The input data for the OpenAI request.
- logger (GalileoLogger): The Galileo logger instance.
- should_complete_trace (bool): Whether to complete the trace when the generator is exhausted.
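The pattern this class describes — buffer each streamed chunk, then log the full response when the stream runs out — can be sketched in plain Python. The names below (`LoggingStream`, `ListLogger`) are illustrative stand-ins, not the real Galileo classes:

```python
class ListLogger:
    """Stand-in for GalileoLogger: just collects logged text."""
    def __init__(self):
        self.records = []

    def log(self, text):
        self.records.append(text)


class LoggingStream:
    """Wraps a streaming generator, buffers each chunk, and logs the
    assembled response once the stream is exhausted (mirroring what
    ResponseGeneratorSync is described as doing)."""

    def __init__(self, response, logger):
        self._response = iter(response)
        self._logger = logger
        self._chunks = []

    def __iter__(self):
        return self

    def __next__(self):
        try:
            chunk = next(self._response)
        except StopIteration:
            # Generator exhausted: log the full response, then re-raise
            # so the consumer's loop terminates normally.
            self._logger.log("".join(self._chunks))
            raise
        self._chunks.append(chunk)
        return chunk


logger = ListLogger()
stream = LoggingStream(iter(["Hel", "lo"]), logger)
chunks = list(stream)  # consume the stream as usual; logging happens at the end
```

Because logging happens only in the `StopIteration` branch, the consumer sees exactly the chunks the underlying generator produced, with no change in behavior.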
OpenAIGalileo
This class logs OpenAI API calls to Galileo. It wraps the OpenAI client methods to add logging functionality without changing the original API behavior.

Arguments

- _galileo_logger (Optional[GalileoLogger]): The Galileo logger instance used for logging OpenAI API calls.
initialize
Arguments

- project (Optional[str]): The project to log to. If None, uses the default project.
- log_stream (Optional[str]): The log stream to log to. If None, uses the default log stream.

Returns

- Optional[GalileoLogger]: The initialized Galileo logger instance.
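The None-falls-back-to-default behaviour described above can be sketched as follows; `DEFAULT_PROJECT`, `DEFAULT_LOG_STREAM`, and `FakeLogger` are hypothetical names, not Galileo's actual defaults or classes:

```python
from typing import Optional

# Hypothetical defaults; the real defaults come from Galileo configuration.
DEFAULT_PROJECT = "my-project"
DEFAULT_LOG_STREAM = "my-log-stream"


class FakeLogger:
    """Stand-in for GalileoLogger: records only its destination."""
    def __init__(self, project: str, log_stream: str):
        self.project = project
        self.log_stream = log_stream


def initialize(project: Optional[str] = None,
               log_stream: Optional[str] = None) -> Optional[FakeLogger]:
    # None means "use the default", as the documentation above states.
    return FakeLogger(
        project if project is not None else DEFAULT_PROJECT,
        log_stream if log_stream is not None else DEFAULT_LOG_STREAM,
    )
```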
register_tracing
Registers tracing by wrapping the following OpenAI client methods:

- openai.resources.chat.completions.Completions.create
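Registering a traced method amounts to replacing it in place with a logging wrapper that delegates to the original. The sketch below uses a dummy `Completions` class and an in-memory call list; it is not the real openai resource class or Galileo's actual implementation:

```python
calls = []  # in-memory stand-in for the Galileo logging backend


class Completions:
    """Dummy stand-in for openai.resources.chat.completions.Completions."""
    def create(self, **kwargs):
        return {"choices": [{"message": {"content": "ok"}}]}


def register_tracing(cls, method_name):
    """Swap cls.<method_name> for a wrapper that records every call and
    its result, then delegates to the original method unchanged."""
    original = getattr(cls, method_name)

    def traced(self, **kwargs):
        result = original(self, **kwargs)
        calls.append({"method": method_name, "kwargs": kwargs, "result": result})
        return result

    setattr(cls, method_name, traced)


register_tracing(Completions, "create")
response = Completions().create(model="gpt-4o")  # logged transparently
```

Because the wrapper calls the original method and returns its result untouched, callers see the same API behavior as before registration.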