Deploy Llama 4 Scout models using the Amazon Bedrock managed service.

This Terraform configuration sets up a basic example deployment that demonstrates how to serve Amazon Bedrock foundation models on Amazon Web Services (AWS). Amazon Bedrock provides fully managed access to foundation models, so there is no infrastructure for you to provision or manage.
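As a quick illustration of the managed-service model (this is not part of the Terraform configuration itself), a minimal boto3 sketch can list the Meta foundation models available in your region through the standard Bedrock control-plane API; the region is an assumption and should match your deployment:

```python
import boto3

# Control-plane client ("bedrock"), as opposed to the runtime client used for inference
bedrock = boto3.client('bedrock', region_name='us-east-1')

# List Meta foundation models available in this region; nothing to provision or host
models = bedrock.list_foundation_models(byProvider='Meta')
for model in models['modelSummaries']:
    print(model['modelId'], '-', model['modelName'])
```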
This example shows how to use basic services such as:
In our architecture patterns for private cloud guide, we outline advanced patterns that you may choose to implement in a more complete deployment. These include:
Configure AWS credentials:

```bash
aws configure
```

Create the configuration:

```bash
cd terraform/amazon-bedrock-default
cp terraform.tfvars.example terraform.tfvars
```

Edit terraform.tfvars with your values, then deploy:

```bash
terraform init
terraform plan
terraform apply
```
Once the deployment is complete, you can invoke the model with the AWS SDK for Python (boto3):

```python
import boto3
import json

# Bedrock runtime client for model invocation
bedrock = boto3.client('bedrock-runtime', region_name='us-east-1')

# Invoke the Llama 4 Scout model with a simple prompt
response = bedrock.invoke_model(
    modelId='meta.llama4-scout-17b-instruct-v1:0',
    body=json.dumps({
        "prompt": "Hello, how are you?",
        "max_gen_len": 256,
        "temperature": 0.7
    })
)

# The response body is JSON; Llama models return the generated text in "generation"
result = json.loads(response['body'].read())
print(result['generation'])
```
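If you prefer to stream tokens as they are generated, a minimal sketch using the response-streaming variant of the same runtime API is shown below; the chunk layout assumes the standard Llama response format on Bedrock:

```python
import boto3
import json

bedrock = boto3.client('bedrock-runtime', region_name='us-east-1')

# Stream the generation instead of waiting for the full response
response = bedrock.invoke_model_with_response_stream(
    modelId='meta.llama4-scout-17b-instruct-v1:0',
    body=json.dumps({
        "prompt": "Hello, how are you?",
        "max_gen_len": 256,
        "temperature": 0.7
    })
)

# Each event carries a JSON chunk; Llama chunks expose partial text in "generation"
for event in response['body']:
    chunk = json.loads(event['chunk']['bytes'])
    print(chunk.get('generation', ''), end='', flush=True)
print()
```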