Galileo supports AWS Bedrock integration via Inference Profiles. This lets you map Galileo-supported model identifiers to AWS Bedrock Inference Profile ARNs, giving you greater flexibility and alignment with your existing Bedrock configurations. This page explains what inference profiles are, how Galileo integrates with them, and how to configure the integration with a simple setup script.

What are AWS Bedrock Inference Profiles?

An AWS Bedrock Inference Profile is an AWS resource that represents a way to invoke a foundation model (such as Anthropic Claude, Meta Llama, or Mistral) while tracking usage and cost under a named profile in your AWS account.
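An inference profile is identified by an ARN. As a minimal sketch (the account ID and profile ID below are placeholders, not real resources), you can sanity-check an ARN's shape before registering it with Galileo:

```shell
# Placeholder ARN: the account ID and profile ID are examples only.
PROFILE_ARN="arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123xyz"

# Check the ARN matches the expected Bedrock inference-profile shape.
if echo "$PROFILE_ARN" | grep -qE '^arn:aws:bedrock:[a-z0-9-]+:[0-9]{12}:(application-)?inference-profile/.+$'; then
  echo "ARN looks valid"
else
  echo "Unexpected ARN format" >&2
fi
```

Profiles you create in your account use the `application-inference-profile/` resource type, while AWS's system-defined cross-region profiles use `inference-profile/`; the pattern above accepts both.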

How Galileo works with Inference Profiles

When you use inference profiles with Galileo:
  • You create an Inference Profile in AWS Bedrock.
  • You create an IAM role in your AWS account that Galileo can assume.
  • You register that role and your Inference Profile ARN with Galileo.
When you run evaluations or prompts in Galileo, Galileo:
  • Invokes Bedrock using your Inference Profile.
  • Logs results and metrics back to Galileo.
Your models, data, and billing remain fully in your AWS account.
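For the role Galileo assumes, the trust policy must allow `sts:AssumeRole` from Galileo's principal. The fragment below is a hedged sketch; the principal account ID and external ID are placeholders for the real values Galileo provides during setup:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<GALILEO_ACCOUNT_ID>:root" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<EXTERNAL_ID_FROM_GALILEO>" }
      }
    }
  ]
}
```

The role's permissions policy must also allow `bedrock:InvokeModel` (and `bedrock:InvokeModelWithResponseStream`, if you use streaming) on your inference profile and its underlying models.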

Setting up the AWS Bedrock Inference Profile integration

The script below configures the AWS Bedrock integration in Galileo. It does not create AWS resources.

#!/bin/bash
#
# This is an example script showing how to set up the AWS Bedrock integration in Galileo.
# Customize inference_profiles to your needs; the values below are examples only.
# You can use either a model alias or a model name, and map it to an
# inference profile ARN.
#

if [ -z "$GALILEO_API_KEY" ]; then
  echo "Error: GALILEO_API_KEY environment variable is not set"
  exit 1
fi

if [ -z "$AWS_ROLE_ARN" ]; then
  echo "Error: AWS_ROLE_ARN environment variable is not set"
  exit 1
fi

if [ -z "$GALILEO_API_URL" ]; then
  echo "Error: GALILEO_API_URL environment variable is not set"
  exit 1
fi

curl "${GALILEO_API_URL}/integrations/aws_bedrock" \
  -X PUT \
  -H "Galileo-API-Key: ${GALILEO_API_KEY}" \
  -H "content-type: application/json" \
  --data-raw "$(cat <<EOF
{
  "credential_type": "assumed_role",
  "region": "us-east-1",
  "token": {
    "aws_role_arn": "${AWS_ROLE_ARN}"
  },
  "inference_profiles": {
    "anthropic.claude-3-sonnet-20240229-v1:0":
      "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/my-sonnet-profile"
  }
}
EOF
)"
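The `inference_profiles` object is not limited to a single entry; the same request can map several model identifiers at once. A sketch (the model IDs, account ID, and profile names below are placeholders) that builds a multi-model payload and validates the JSON locally before sending it:

```shell
AWS_ROLE_ARN="arn:aws:iam::123456789012:role/galileo-bedrock-role"  # placeholder

# Build the request body with two example model-to-profile mappings.
PAYLOAD=$(cat <<EOF
{
  "credential_type": "assumed_role",
  "region": "us-east-1",
  "token": { "aws_role_arn": "${AWS_ROLE_ARN}" },
  "inference_profiles": {
    "anthropic.claude-3-sonnet-20240229-v1:0": "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/my-sonnet-profile",
    "meta.llama3-1-70b-instruct-v1:0": "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/my-llama-profile"
  }
}
EOF
)

# Validate the JSON locally before sending it (requires python3).
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload is valid JSON"
```

Catching a malformed payload locally is cheaper than debugging a rejected PUT request against the Galileo API.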

Supported models

Galileo supports all major Amazon Bedrock foundation models, including:
  • Anthropic Claude 3, 3.5, and 4 families (Sonnet, Haiku, Opus)
  • Meta Llama (3, 3.1, 3.2)
  • Mistral and Mixtral
  • Amazon Titan
  • Cohere Command

Requesting a demo or help with setup

If you’d like to see a demo of the AWS Bedrock Inference Profile integration or need help setting it up, please Request a Demo.