Adding Annotations
Follow this step-by-step guide to get started using Annotations in your AI projects.
Prerequisites
First, follow the Getting Started with Galileo guide if you don’t already have a Galileo Project set up.
Then, use the following steps to add annotations to your Project.
Step 1: Open Your Galileo App
Open your Galileo application in your code editor.
In this guide, we are using the finished demo application from the Getting Started guide.
Each code snippet in the following steps will be added to the end of the Getting Started demo code.
Step 2: Define Tags and Metadata
Create descriptive tags and metadata to be attached to your logs. Both Spans and Traces can have their own tags and metadata.
Define tags as a list of relevant labels, and metadata as a dictionary of label types and their values. The individual tag and metadata values must be strings.
NOTE: The answer variable is set to the raw text output of the model so that it can be used later.
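A minimal sketch of this step; the specific labels and keys below are illustrative placeholders, not required names:

```python
# Illustrative tags and metadata; the labels and keys are placeholders.
tags = ["getting-started", "gpt-demo"]  # a list of string labels

metadata = {
    "experiment": "v1",  # string keys with string values only
    "env": "demo",
}
```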
Step 3: Initialize Galileo Logger
Initialize logging by importing and calling the Galileo Logger.
The Project name and Log Stream name inputs determine which Galileo Project and Log Stream the logs will be created in.
After running our application, the logs will appear in the chosen Project’s log stream in the Galileo Console.
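A sketch of this step, assuming the galileo Python SDK is installed; the Project and Log Stream names are placeholders:

```python
from galileo import GalileoLogger

# Placeholder names; substitute your own Project and Log Stream.
logger = GalileoLogger(project="my-project", log_stream="my-log-stream")
```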
Step 4: Initialize New Trace
Initialize a new Trace to start listening for data to log.
By using the tags and metadata inputs, our annotations are attached to the Trace.
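A sketch of this step, assuming the logger, prompt, tags, and metadata defined in the previous steps:

```python
# Start a new Trace annotated with our tags and metadata.
trace = logger.start_trace(
    input=prompt,  # the raw text input to the application
    tags=tags,
    metadata=metadata,
)
```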
Step 5: Create New Span
Create a new Span containing the data created by running the LLM.
By using the tags and metadata inputs, our annotations are attached to the Span.
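A sketch of this step, assuming the prompt and LLM response from the Getting Started demo; the model name is a placeholder:

```python
# Log the LLM call as a Span, annotated with our tags and metadata.
logger.add_llm_span(
    input=[{"role": "system", "content": prompt}],
    output=response.choices[0].message.content,
    model="gpt-4o",  # placeholder model name
    tags=tags,
    metadata=metadata,
)
```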
NOTE: In this guide, the tags and metadata used for the Span and the Trace are identical, but they don't have to be. You can use different tags and metadata for Spans and the Traces they're attached to.
Step 6: Close Trace & Push Logs
To close the new Trace and complete the logging session, we use logger.conclude() with the LLM's raw text output as the input. Then, logger.flush() pushes the logs to the selected Project's Log Stream.
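A sketch of this step, assuming the answer variable holding the model's raw text output from earlier:

```python
logger.conclude(answer)  # close the Trace with the LLM's raw text output
logger.flush()           # push the logs to the Project's Log Stream
```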
Step 7: Review Your Code
With each of the previous code snippets added to your Galileo application, it is ready to run and create annotated logs.
Below is the final combined application code for the Getting Started demo application, with our annotated log functionality added to it.
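A hedged reconstruction of what that combined code could look like, assuming the OpenAI-based Getting Started demo; the prompt, model name, Project, and Log Stream names are all placeholders:

```python
from galileo import GalileoLogger
from openai import OpenAI

# Placeholder prompt standing in for the Getting Started demo prompt.
prompt = "Explain the following topic succinctly: Newton's First Law"

# Step 2: tags and metadata (illustrative values).
tags = ["getting-started", "gpt-demo"]
metadata = {"experiment": "v1", "env": "demo"}

# Step 3: initialize the Galileo Logger (placeholder names).
logger = GalileoLogger(project="my-project", log_stream="my-log-stream")

# Step 4: start a new Trace annotated with our tags and metadata.
trace = logger.start_trace(input=prompt, tags=tags, metadata=metadata)

# Run the LLM (placeholder model name).
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "system", "content": prompt}],
)
answer = response.choices[0].message.content

# Step 5: log the LLM call as an annotated Span.
logger.add_llm_span(
    input=[{"role": "system", "content": prompt}],
    output=response.choices[0].message.content,
    model="gpt-4o",
    tags=tags,
    metadata=metadata,
)

# Step 6: close the Trace and push the logs.
logger.conclude(answer)
logger.flush()
```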
Step 8: Run Your Application
Run your Galileo application.
If your application file is named app (as in the Getting Started demo), you can run it by using the following command in your terminal.
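Assuming the application is a Python file saved as app.py, the command might look like:

```shell
python app.py
```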
Step 9: Open Project in Galileo Console
In the Galileo Console, select your Project and Log Stream in the top-left corner.
Each time you run your application, a new Trace entry appears in your Log Stream.
Each metadata key-value pair is displayed in its own column, which can be used for sorting and filtering.
Step 10: View Trace Annotations
Click on an entry in the list.
You will see the data logged to the Trace. This includes:
- The raw text input (because we initialized the Trace with input=prompt)
- The raw text output from the LLM (because we closed the Trace with logger.conclude(answer))
- The tags and metadata we created (found in the Parameters section in the top-right)
Step 11: View Span Annotations
With the Trace open, click the llm button below Trace in the data map on the left.
You will see the data logged to the Span. This includes:
- The complete JSON LLM input (because we created the Span with input=[{"role": "system", "content": prompt}])
- The complete JSON LLM output (because we created the Span with output=response.choices[0].message.content)
- The tags and metadata we created (found in the Parameters section in the top-right)
Next Steps:
Continue testing, customizing, and incorporating tags and metadata into your AI project development process and implementation.
Use Annotations to:
- Keep your experimentation process organized.
- Coordinate with your team and track development.
- Track individual steps in your AI project with Traces containing multiple Spans that each have their own distinct annotations.
- Use the tag and metadata values in your code to automate alerts and improvements.