What are AWS Bedrock Inference Profiles?
An AWS Bedrock Inference Profile is an AWS resource that represents a way to invoke a foundation model (such as Anthropic Claude, Meta Llama, or Mistral) while tracking usage and cost under a named profile in your AWS account.

How Galileo works with Inference Profiles
When you use inference profiles with Galileo:
- You create an Inference Profile in AWS Bedrock.
- You create an IAM role in your AWS account that Galileo can assume.
- You register that role and your Inference Profile ARN with Galileo.
- Galileo assumes that role and invokes Bedrock using your Inference Profile.
- Galileo logs the results and metrics back to your Galileo project.
- Your models, data, and billing remain fully in your AWS account.
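The IAM role in the steps above needs a trust policy that lets Galileo's AWS principal assume it. The sketch below (Python, using boto3) shows one minimal way to build such a policy; the Galileo account ID, external ID, and role name are placeholders, not values from this document — substitute whatever Galileo provides during setup.

```python
import json


def build_galileo_trust_policy(galileo_account_id: str, external_id: str) -> dict:
    """Build an IAM trust policy allowing an external AWS account to assume this role.

    Both arguments are placeholders: Galileo supplies the real account ID
    and external ID during integration setup.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{galileo_account_id}:root"},
                "Action": "sts:AssumeRole",
                # The external ID condition guards against the
                # confused-deputy problem for cross-account access.
                "Condition": {"StringEquals": {"sts:ExternalId": external_id}},
            }
        ],
    }


# Creating the role with boto3 (requires credentials with iam:CreateRole;
# the role name here is only an example):
# import boto3
# iam = boto3.client("iam")
# iam.create_role(
#     RoleName="GalileoBedrockRole",
#     AssumeRolePolicyDocument=json.dumps(
#         build_galileo_trust_policy("123456789012", "my-external-id")
#     ),
# )
```

Scoping the role's permissions to `bedrock:InvokeModel` on your Inference Profile ARN keeps the cross-account access as narrow as possible.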
Setting up the AWS Bedrock Inference Profile integration
The script below configures the AWS Bedrock integration in Galileo. It does not create AWS resources.

Supported models
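As a rough sketch of what assembling that configuration might look like in Python — the field names and environment variables here are assumptions for illustration, not Galileo's documented API; consult Galileo's integration docs for the actual interface:

```python
import os


def build_bedrock_integration_payload(role_arn: str, profile_arn: str, region: str) -> dict:
    """Assemble the two ARNs Galileo needs: the IAM role it will assume and
    the Inference Profile it should invoke. Field names are illustrative
    assumptions, not Galileo's documented schema."""
    return {
        "aws_role_arn": role_arn,
        "inference_profile_arn": profile_arn,
        "region": region,
    }


payload = build_bedrock_integration_payload(
    role_arn=os.environ.get(
        "GALILEO_AWS_ROLE_ARN",
        "arn:aws:iam::123456789012:role/GalileoBedrockRole",  # placeholder
    ),
    profile_arn=os.environ.get(
        "BEDROCK_PROFILE_ARN",
        "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/example",  # placeholder
    ),
    region="us-east-1",
)
# This payload would then be registered with Galileo, e.g. via its SDK or an
# authenticated HTTP call to your Galileo deployment's integrations endpoint.
```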
Galileo supports all major Amazon Bedrock foundation models, including:
- Anthropic Claude (3, 3.5, 4; Haiku, Sonnet, Opus)
- Meta Llama (3, 3.1, 3.2)
- Mistral and Mixtral
- Amazon Titan
- Cohere Command