openai
Galileo wrapper for OpenAI that automatically logs prompts and responses.
This module provides a drop-in replacement for the OpenAI library that automatically logs all prompts, responses, and related metadata to Galileo. It works by intercepting calls to the OpenAI API and logging them using the Galileo logging system.
Note that the original OpenAI package is still required as a project dependency to use this wrapper.
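As a minimal sketch of the drop-in usage described above (the `galileo.openai` import path and environment-based credential setup are assumptions; check the package for the exact entry point):

```python
# Sketch of drop-in usage, assuming the wrapper is exposed as
# `galileo.openai.openai` and that OpenAI/Galileo credentials are provided
# via environment variables (e.g. OPENAI_API_KEY).
from galileo.openai import openai

client = openai.OpenAI()  # same constructor as the stock OpenAI client

# This call behaves exactly like the unwrapped client, but the prompt,
# response, and related metadata are also logged to Galileo.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```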
ResponseGeneratorSync Objects
A wrapper for OpenAI streaming responses that logs the response to Galileo.
This class wraps the OpenAI streaming response generator and logs the accumulated response to Galileo once the generator is exhausted. It implements the iterator protocol, so responses can still be consumed as a stream.
Arguments:

- `resource` (`OpenAiModuleDefinition`): The OpenAI resource definition.
- `response` (`Generator` or `openai.Stream`): The OpenAI streaming response.
- `input_data` (`OpenAiInputData`): The input data for the OpenAI request.
- `logger` (`GalileoLogger`): The Galileo logger instance.
- `should_complete_trace` (`bool`): Whether to complete the trace when the generator is exhausted.
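For streaming requests, the wrapped generator is consumed exactly like the raw stream; the sketch below assumes the same hypothetical `galileo.openai` import as above and that logging (and trace completion, when `should_complete_trace` is true) happens once the stream is exhausted:

```python
# Sketch of streaming usage. The wrapper returns an iterator (conceptually a
# ResponseGeneratorSync) in place of the raw openai.Stream; the import path
# is an assumption for illustration.
from galileo.openai import openai

client = openai.OpenAI()

stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Stream a short haiku."}],
    stream=True,
)

# Iterating works as with the plain OpenAI stream; the accumulated response
# is logged to Galileo only after the last chunk has been yielded.
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```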
OpenAIGalileo Objects
This class is responsible for intercepting OpenAI API calls and logging them to Galileo.
It wraps the OpenAI client methods to add logging functionality without changing the original API behavior.
Arguments:

- `_galileo_logger` (`Optional[GalileoLogger]`): The Galileo logger instance used for logging OpenAI API calls.
initialize
Initialize a Galileo logger.
Arguments:

- `project` (`Optional[str]`): The project to log to. If None, uses the default project.
- `log_stream` (`Optional[str]`): The log stream to log to. If None, uses the default log stream.

Returns:

- `Optional[GalileoLogger]`: The initialized Galileo logger instance.
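A hedged sketch of how a logger might be initialized for a specific project and log stream; direct instantiation of `OpenAIGalileo` and the import path are assumptions, since in normal use the module wires this up automatically on import:

```python
# Illustrative only: exercises the initialize() signature documented above.
from galileo.openai import OpenAIGalileo  # hypothetical import path

openai_galileo = OpenAIGalileo()
logger = openai_galileo.initialize(project="my-project", log_stream="dev")
if logger is not None:
    print("Galileo logger ready:", type(logger).__name__)
```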
register_tracing
This method wraps the OpenAI client methods to intercept calls and log them to Galileo.
It is called automatically when the module is imported.
The wrapped methods include:
- openai.resources.chat.completions.Completions.create
Additional methods can be added to the OPENAI_CLIENT_METHODS list.
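The sketch below illustrates the general wrapping technique (replacing `Completions.create` with a logging wrapper). It is not the module's actual implementation; the `log_to_galileo` helper and the timing metadata are placeholders for what the real Galileo logging does.

```python
# Conceptual sketch of register_tracing-style method wrapping: the original
# method is saved and replaced with a wrapper that logs around the call.
import time
import openai.resources.chat.completions as completions_module


def log_to_galileo(kwargs, response, duration_s):
    # Placeholder for the real Galileo logging call.
    print(f"[galileo] model={kwargs.get('model')} duration={duration_s:.2f}s")


_original_create = completions_module.Completions.create


def _traced_create(self, *args, **kwargs):
    start = time.monotonic()
    response = _original_create(self, *args, **kwargs)
    log_to_galileo(kwargs, response, time.monotonic() - start)
    return response


# Replacing the method on the class makes every client pick up tracing,
# without changing the original API behavior.
completions_module.Completions.create = _traced_create
```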