Log Streams in Galileo are organizational units that group related logs together based on logical deployments, applications, or environments. They provide a structured way to categorize and manage logs across your AI workflows, making it easier to monitor, analyze, and debug specific segments of your application ecosystem.

Log Streams serve as containers for logs, allowing you to:
- Separate logs from different environments (development, staging, production)
- Isolate logs from different applications or services
- Group logs by feature, team, or business unit
- Organize logs for specific experiments or testing scenarios
To monitor your application, configure the metrics you want evaluated for each trace and session in the Log Stream.
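For example, a single project can hold several Log Streams, one per environment, service, or experiment. The sketch below follows the `GalileoLogger` usage shown later in this page; the project and stream names are purely illustrative:

```python
from galileo import GalileoLogger

# Hypothetical stream names: one stream per environment for the same project
prod_logger = GalileoLogger(project="my-project", log_stream="checkout-production")
staging_logger = GalileoLogger(project="my-project", log_stream="checkout-staging")

# A separate stream for an experiment, kept apart from day-to-day traffic
experiment_logger = GalileoLogger(project="my-project", log_stream="prompt-ab-test")
```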
Log Streams play a crucial role in organizing your logging infrastructure for several reasons:
- **Organization**: Log Streams provide a clean separation between different parts of your application ecosystem, preventing logs from becoming mixed and difficult to analyze.
- **Environment Isolation**: They allow you to keep logs from development, staging, and production environments separate, making it easier to focus on relevant data.
- **Lifecycle Management**: Different Log Streams can have different retention policies, allowing you to keep critical production logs longer while purging development logs more frequently.
Log Streams contribute to effective log management in several ways:
- **Debugging**: When issues arise, you can focus on logs from the specific environment or application where the problem occurred, reducing noise and speeding up troubleshooting.
- **Monitoring**: Log Streams enable targeted monitoring of specific applications or environments, making it easier to set up relevant alerts and dashboards.
- **Analysis**: By organizing logs into logical groups, Log Streams facilitate more focused analysis and pattern recognition within specific contexts.
You can also create Log Streams programmatically using the Galileo SDK:
```python
from galileo import GalileoLogger

# Initialize with a new Log Stream (it will be created if it doesn't exist)
logger = GalileoLogger(project="my-project", log_stream="new-log-stream")
```
Here’s a simple example of creating and using a Log Stream with the Galileo SDK:
```python
import os

from galileo import GalileoLogger
from galileo.openai import openai

# Initialize with a specific Log Stream
logger = GalileoLogger(project="my-project", log_stream="production-app")

# Any logs created will go to the specified Log Stream
client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"])
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Tell me about AI"}],
)
```
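Because the OpenAI client above comes from the `galileo.openai` wrapper, the completion call is captured automatically and written to the `production-app` stream. If you are not using a wrapped client, you can log to the same stream manually; the sketch below assumes your SDK version exposes the `GalileoLogger` manual-logging methods (`start_trace`, `add_llm_span`, `conclude`, and `flush`):

```python
# A minimal sketch of manual logging to the same Log Stream
# (method names and parameters are an assumption; verify against your SDK version)
logger.start_trace(input="Tell me about AI")
logger.add_llm_span(
    input="Tell me about AI",
    output="AI is the simulation of human intelligence by machines...",
    model="gpt-4o",
)
logger.conclude(output="AI is the simulation of human intelligence by machines...")
logger.flush()  # send the buffered trace to the production-app Log Stream
```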
Here’s an example of managing logs across different environments:
```python
import os

from galileo import GalileoLogger
from galileo.openai import openai

# Determine environment from environment variable
env = os.getenv("APP_ENVIRONMENT", "development")

# Create a logger with an environment-specific Log Stream
logger = GalileoLogger(
    project="my-ai-app",
    log_stream=f"my-ai-app-{env}",
)

# All logs will now go to the environment-specific stream
```
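With this pattern, the same code writes to `my-ai-app-development`, `my-ai-app-staging`, or `my-ai-app-production` depending on how `APP_ENVIRONMENT` is set at deploy time. If you prefer a scoped setup, a sketch under the assumption that the `galileo_context` manager (used for tagging below) also accepts `project` and `log_stream` arguments:

```python
import os

from galileo import galileo_context
from galileo.openai import openai

env = os.getenv("APP_ENVIRONMENT", "development")

# Route everything logged inside this block to the environment-specific stream
# (assumes galileo_context accepts project/log_stream keyword arguments)
with galileo_context(project="my-ai-app", log_stream=f"my-ai-app-{env}"):
    client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Health check"}],
    )
```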
You can use tags and metadata to further organize logs within a Log Stream:
```python
from galileo import GalileoLogger, galileo_context

# Initialize the base logger
logger = GalileoLogger(project="my-project", log_stream="production")

# Use context to add tags for a specific feature
with galileo_context(tags=["recommendation-engine", "v2.3"]):
    # All logs in this context will have the specified tags,
    # making them easier to filter within the Log Stream
    result = recommendation_engine.get_recommendations(user_id)
```
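Tags applied this way are attached to every trace logged inside the context, so within the `production` Log Stream you can filter down to a single feature or release when reviewing logs in the Galileo console.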