OpenAI Wrapper
The OpenAI wrapper is the simplest way to integrate Galileo logging into your application. By importing the OpenAI client through Galileo's wrapper instead of importing the OpenAI library directly, you automatically log all prompts, responses, and usage statistics with no code changes beyond the import.
Installation
First, make sure you have the Galileo SDK installed:
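Assuming the SDK is published on PyPI under the package name `galileo` (check your Galileo documentation if your package name differs), installation is a single `pip` command:

```bash
pip install galileo
```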
Setup
Create or update a .env file with your Galileo API key and other optional settings:
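A minimal `.env` might look like the following sketch; `GALILEO_API_KEY` is required, and the other variable names shown here are assumptions that may differ in your SDK version:

```
GALILEO_API_KEY=your-galileo-api-key
# Optional: route logs to a specific project and log stream (names assumed)
GALILEO_PROJECT=my-project
GALILEO_LOG_STREAM=my-log-stream
```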
Basic Usage
Instead of importing OpenAI directly, import it from Galileo:
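A minimal sketch, assuming the wrapper is exposed as `galileo.openai` and that `OPENAI_API_KEY` is set in your environment:

```python
import os

# Import the OpenAI client through Galileo's wrapper instead of `import openai`
from galileo.openai import openai

# The wrapped client is used exactly like the standard OpenAI client
client = openai.OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Explain logging in one sentence."}],
)

# The call above is logged to Galileo automatically; use the response as usual
print(response.choices[0].message.content)
```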
This example will automatically produce a single-span trace in the Galileo Logstream UI. The wrapper handles all the logging for you, capturing:
- The input prompt
- The model used
- The response
- Timing information
- Token usage
- Other relevant metadata
Using with Context Manager
For more control over when traces are flushed to Galileo, you can use the galileo_context context manager:
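A sketch of the pattern, assuming `galileo_context` is importable from the top-level `galileo` package:

```python
from galileo import galileo_context
from galileo.openai import openai

client = openai.OpenAI()

# Traces created inside the block are flushed when the context manager exits
with galileo_context():
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(response.choices[0].message.content)
```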
This ensures that traces are flushed when the context manager exits, which is particularly useful for long-running applications such as Streamlit apps, where the process never terminates on its own.
Streaming Support
The OpenAI wrapper also supports streaming responses. When streaming is enabled, the wrapper logs the response as the chunks arrive:
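A sketch of a streaming call; the wrapped client mirrors the standard streaming interface, so iterating over chunks works the same way it does with the vanilla OpenAI client:

```python
from galileo.openai import openai

client = openai.OpenAI()

# Request a streaming response; the wrapper captures the streamed output for logging
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a haiku about logging."}],
    stream=True,
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```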
Explicit Flushing
In some cases (like long-running processes), it may be necessary to explicitly flush the trace to upload it to Galileo:
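One way to do this, assuming `galileo_context` exposes a `flush()` method as suggested by the related resources below:

```python
from galileo import galileo_context
from galileo.openai import openai

client = openai.OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize this request."}],
)

# Explicitly push any buffered traces to Galileo, e.g. at the end of a batch job
galileo_context.flush()
```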
Advanced Usage
The OpenAI wrapper is intended to support all the same functionality as the original OpenAI library, including:
- Chat completions
- Text completions
- Embeddings
- Image generation
- Audio transcription and translation
For each of these, the wrapper will automatically log the relevant information to Galileo, making it easy to track and analyze your AI application’s performance.
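For example, an embeddings request goes through the same wrapped client; this is a sketch and assumes embeddings logging is available in your SDK version:

```python
from galileo.openai import openai

client = openai.OpenAI()

# Embedding requests made through the wrapped client are logged like chat calls
embedding = client.embeddings.create(
    model="text-embedding-3-small",
    input="Galileo logs this embedding request automatically.",
)
print(len(embedding.data[0].embedding))
```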
Combining with the @log Decorator
You can combine the OpenAI wrapper with the @log decorator to create more complex traces:
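A sketch combining the two, assuming `log` is importable from the top-level `galileo` package:

```python
from galileo import log
from galileo.openai import openai

client = openai.OpenAI()

# The decorator wraps this function in its own span; the OpenAI call inside
# should appear as a nested child span in the same trace
@log
def answer_question(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

print(answer_question("What does the @log decorator add to a trace?"))
```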
Benefits of Using the Wrapper
- Zero-config logging: No need to add logging code throughout your application
- Complete visibility: All prompts and responses are automatically captured
- Minimal code changes: Simply change your import statement
- Automatic tracing: Creates spans and traces without manual setup
- Streaming support: Works with both regular and streaming responses
Related Resources
- @log Decorator - For decorating functions with logging
- GalileoLogger - For more manual control over logging
- galileo_context - For managing trace context and automatic flushing